Out of memory exception for straightforward report - ssrs-2008

I'm trying to run an SSRS report. It's a straightforward report, just to render data from a table which has around 80K records.
No aggregation or data processing is done in report. There are around 50 columns along with 19 report parameters. I just have to display those 50 columns in report (no pivot).
Usually it takes around 5 minutes to render this report on our development server (off-peak hours). The same is true on our production server, but there users are getting "Out of memory" exceptions a lot, and the report parameter criteria are not applied (those are the complaints I get from users).
I'm able to filter by the criteria locally without any problem, although it takes a long time to render.
Why does it take such a long time to render the report, even though the report is straightforward?
The report runs fine when I hit F5 on VS 2008 but from time to time I get out of memory exceptions when I hit the "Preview" tab.
Some of the column names contain a "#" character. If I include such columns in the report, an "out of memory exception" is thrown (especially in Preview mode). Is there truth to the claim that SSRS doesn't like column names with "#"? E.g. my column name was "KLN#".
I have created a nonclustered index on the table but that didn't help me much.
What's the difference between running the report in Preview mode vs hitting F5 in VS 2008? It's fine when I hit F5, even though it takes 5 minutes, but Preview mode has the problem.
There isn't much room for redesign (since it's a straightforward report); perhaps I can only remove some of the report parameters.
Any suggestion would be appreciated.

In addition to the already posted answers and regarding the problems with the preview in the Report Designer or Report Manager there is another possible solution: avoid too much data on the first report page!
This can be done by paginating into small sets of records, e.g. via custom groups with page breaks, sometimes automatically (see done_merson's answer), or by adding a simple cover page.
These solutions are especially helpful in the development phase, and if you plan to render the report results to Excel or PDF anyway.
I had a similar case, with out of memory exceptions and never-returning reports, for a simple report whose dataset contained about 70k records.
The query executed in about 1-2 minutes, but neither the Report Designer nor our development SSRS 2008 R2 server (Report Manager) could show the resulting report preview. Finally I suspected the HTML preview was the bottleneck and avoided it by adding a cover page with a simple textbox. The next report execution took about 2 minutes and successfully showed the HTML preview with the cover page. Rendering the complete result to Excel only took another 30 seconds.
Hopefully this will help others, since this page is still one of the top posts if you search for SSRS out of memory exceptions.

Why does it take such a long time to render...?
I have created a Nonclustered index on the table but that didn't help me much.
Because (AFAIK) SSRS will construct an in-memory model of the report before rendering. Know that SSRS will take three steps in creating a report:
Retrieve the data.
Create an internal model by combining the report and the data.
Render the report to the appropriate format (preview, HTML, XLS, etc.)
You can check the ExecutionLog2 view to see how much time each step takes. Step 1 is probably already reasonably fast (seconds), so the added index is not tackling the bottleneck. Steps 2 and 3 are probably taking a lot of time and require a lot of RAM.
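As a sketch, the per-step timing can be read straight from the report server catalog database (assuming the default ReportServer catalog name; times are in milliseconds):

```sql
-- Breakdown of the three steps for the most recent executions.
-- ReportServer is the default catalog database name; adjust if yours differs.
SELECT TOP 10
    ItemPath,
    Format,
    TimeDataRetrieval,  -- step 1: running the dataset queries (ms)
    TimeProcessing,     -- step 2: building the internal report model (ms)
    TimeRendering,      -- step 3: rendering to the output format (ms)
    [RowCount]
FROM ReportServer.dbo.ExecutionLog2
ORDER BY TimeStart DESC;
```

If TimeDataRetrieval is small but TimeProcessing/TimeRendering dominate, indexing won't help; the bottleneck is the in-memory model.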
SSRS doesn't like column names with "#"? My column name was KLN#.
As far as I know, this shouldn't be a problem. More likely, removing that column was just enough to make the report renderable again.
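If you still suspect the "#", a low-risk way to rule it out is to alias the column in the dataset query so the field name SSRS sees is plain alphanumeric. A sketch (KLN# is the column name from the question; the alias and table name are hypothetical):

```sql
-- Alias away the "#" so the SSRS dataset field name is plain alphanumeric.
SELECT [KLN#] AS KLN_Number,
       OtherColumn           -- hypothetical: the rest of the 50 columns
FROM dbo.YourTable;          -- hypothetical table name
```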
There isn't much room for redesign (since it's a straightforward report); perhaps I can only remove some of the report parameters.
SSRS is just not the right tool for this. As such, there is no real "solution" for your problem, only alternatives and workarounds.
Workarounds:
As @glh mentioned in his answer, making more RAM available for SSRS may "help".
Requiring the user to filter the data with a parameter (i.e. don't allow the user to select all those rows, only the ones he needs).
Schedule the report at a quiet moment (when there's enough RAM available) and cache the report.
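For the second workaround, the filter belongs in the dataset query itself (not a report-side filter), so the unneeded rows never leave the database or reach the SSRS processing step. A sketch with hypothetical table, column, and parameter names:

```sql
-- Filter at the source: only the requested rows are retrieved and modeled.
-- @Region and @StartDate map to required (non-blank) report parameters.
SELECT Col1, Col2, Col3      -- the columns the report displays
FROM dbo.BigTable
WHERE Region    = @Region
  AND OrderDate >= @StartDate;
```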
Alternatives:
Create a small custom app that reads from the database and outputs an Excel.
Use SSIS, which (I thought) is better suited for this kind of task (data transformation and migration).
Rethink your setup. You haven't mentioned the context of your report, but perhaps you have an XY Problem. Perhaps your users want the entire report but only need a few key rows, or perhaps they only use it as a backup mechanism (for which there's better alternatives), or...

Try to increase your RAM; see this post for a similar error:
Need SSRS matrix to show more than 400k records

We just had a similar situation and set the "Keep together on one page if possible" option in Tablix Properties / General / Page break options to off and it worked fine.

Related

Why is Crystal Reports Query so slow?

I have many Crystal Reports against the same database. Some execute quickly given the same date parameters, and many fields are the same, as well as the tables they access. One of my reports that used to run quickly is now running very slowly, and I can see it looking through all the records - shown at the bottom as "0 of 100000" until it finds records. I have no idea what I may have changed to make it do this. Some reports still run fast and some do not. These findings are consistent across the reports I am talking about. Does anyone know what setting might be causing this?
I have tried looking for any subtle differences between them - I cannot see anything. Many of them were clones of the original (which still runs fast).
The performance section of my Crystal Reports book states that if the WHERE clause cannot be translated it will be ignored and all records will be processed - which is what this looks like - though I have a valid WHERE clause when I check it in the report.
"Use Indexes Or Server For Speed" is checked. All other settings in Report Options are identical.
Thanks
You can do some troubleshooting:
Try running your query directly on the database and see how long it takes.
Is there any business logic added in your report?
Maybe also try putting the same query in a fresh report and see if it takes a similar time.
Also try debugging your application to see if some part of your code is making the report slow.
Are you running it against a local database or on a server?
Also, if you can share your query, I can take a look.
Let me know if you need more help.

Crystal Reports - Several Group Levels - Improve Efficiency

I've had to make some modifications to one of our reports that is comprised of ~30 sub-reports to change it from being ~4 group levels to now having ~18 group levels. It seems like it's been hit with major processing degradation. I've added group suppression as the only group levels I care about are: 1 (used to break out the data coming back into sections so the database doesn't have to be requeried multiple times thus eliminating extra database hits and extra sub-reports), 6-11 (divisional/regional/etc. data), and 18 (base level, by person).
The levels in between those I need may be needed in the future so they've been accounted for now, but since it's had such a negative impact on the performance I'd like to disable them. Right now they're suppressed but they're obviously still being processed which is killing the performance.
How can I restore the efficiency without losing all the work I've done?
If a section containing a subreport is suppressed, the subreport shouldn't be executed, and hence its query not run.
From the Report menu, select Performance Information and you will get a breakdown of what is taking the time.
If this isn't self-explanatory, give us some more information and we should be able to help.

SSRS report VERY SLOW in prod but SQL query runs FAST

I've spent hours troubleshooting this and I need some fresh perspective . . .
We have a relatively simple report setup in SSRS, simple matrix with columns across the top and data points going down. The SQL query behind the report is "medium" complexity -- has some subqueries and several joins, but nothing real crazy.
Report has worked fine for months and recently has become REALLY slow. Like, 15-20 minutes to generate the report. I can clip-and-paste the SQL query from the Report Designer into SQL Mgmt Studio, replace the necessary variables, and it returns results in less than 2 seconds. I even went so far as to use SQL Profiler to get the exact query that SSRS is executing, and clipped-and-pasted this into Mgmt Studio; still the same thing, sub-second results. The parameters and date ranges specified don't make any difference: I can set parameters to return a small dataset (< 100 rows) or a humongous one (> 10,000 rows) and still get the same results; super-fast in Mgmt Studio but 20 minutes to generate the SSRS report.
Troubleshooting I've attempted so far:
Deleted and re-deployed the report in SSRS.
Tested in Visual Studio IDE on multiple machines and on the SSRS server, same speed (~20 minutes) both places
Used SQL Profiler to monitor the SPID executing the report, captured all SQL statements being executed, and tried them individually (and together) in Mgmt Studio -- runs fast in Mgmt Studio (< 2 seconds)
Monitored server performance during report execution. Processor is pretty darn hammered during the 20 minute report generation, disk I/O is slightly above baseline
Check the execution plans for both to ensure that a combination of parameter sniffing and/or differences in set_options haven't generated two separate execution plans.
This is a scenario I've come across when executing a query from ADO.Net and from SSMS. The problem occurred when the use of different options created different execution plans. SQL Server makes use of the parameter value passed in to attempt to further optimise the execution plan generated. I found that different parameter values were used for each of the generated execution plans, resulting in both an optimal and sub-optimal plan. I can't find my original queries for checking this at the moment but a quick search reveals this article relating to the same issue.
http://www.sqlservercentral.com/blogs/sqlservernotesfromthefield/2011/10/25/multiple-query-plans-for-the-same-query_3F00_/
If you're using SQL Server 2008 there's also an alternative provided via query hint called "OPTIMIZE FOR UNKNOWN" which essentially disables parameter sniffing. Below is a link to an article that assisted my original research into this feature.
http://blogs.msdn.com/b/sqlprogrammability/archive/2008/11/26/optimize-for-unknown-a-little-known-sql-server-2008-feature.aspx
An alternative to the above for versions earlier than 2008 would be to store the parameter value in a local variable within the procedure. This would behave in the same way as the query hint above. This tip comes from the article below (in the edit).
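Both approaches can be sketched as follows (table, column, and procedure names are hypothetical; OPTIMIZE FOR UNKNOWN is the SQL Server 2008 query hint named above):

```sql
-- SQL Server 2008+: disable parameter sniffing for this query via the hint.
SELECT o.OrderId, o.Total
FROM dbo.Orders AS o
WHERE o.CustomerId = @CustomerId
OPTION (OPTIMIZE FOR UNKNOWN);
GO

-- Pre-2008 equivalent: copy the parameter into a local variable inside the
-- procedure, so the optimizer cannot sniff the runtime value at compile time.
CREATE PROCEDURE dbo.GetOrders
    @CustomerId INT
AS
BEGIN
    DECLARE @LocalCustomerId INT;
    SET @LocalCustomerId = @CustomerId;

    SELECT o.OrderId, o.Total
    FROM dbo.Orders AS o
    WHERE o.CustomerId = @LocalCustomerId;
END;
```

In both cases the optimizer compiles a plan for an "average" parameter value instead of the first sniffed one, which avoids the optimal-plan/sub-optimal-plan split described above.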
Edit
A little more searching has unearthed an article with a very in-depth analysis of the subject in case it's of any use, link below.
http://www.sommarskog.se/query-plan-mysteries.html
This issue has been a problem for us as well. We are running SSRS reports from CRM 2011. I have tried a number of the solutions suggested (mapping input parameters to local variables, adding WITH RECOMPILE to the stored procedure) without any luck.
This article on report server application memory configuration (http://technet.microsoft.com/en-us/library/ms159206.aspx) - more specifically, adding the 4000000 value to our RSReportServer.config file - solved the problem.
Reports which would take 30-60 seconds to render now complete in less than 5 seconds which is about the same time the underlying stored procedure takes to execute in SSMS.
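For reference, the setting in question is WorkingSetMaximum, which is not present in RSReportServer.config by default and takes a value in kilobytes (so 4000000 ≈ 4 GB). A sketch of the relevant fragment, per the linked TechNet article:

```xml
<!-- RSReportServer.config fragment: values in kilobytes.
     WorkingSetMaximum must be added manually; it is not in the file by default. -->
<MemorySafetyMargin>80</MemorySafetyMargin>
<MemoryThreshold>90</MemoryThreshold>
<WorkingSetMaximum>4000000</WorkingSetMaximum>
```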

An SSRS 2008 problem: how to increase the response time

I have created a report using SSRS 2008 with multiple levels of grouping of data. The problem arises when I preview the report: it takes a lot of time, sometimes more than 40 minutes.
How can I resolve it?
See if you can cache the dataset; Microsoft recommends caching the dataset when the time taken is more than expected.
Also read http://www.simple-talk.com/sql/performance/a-performance-troubleshooting-methodology-for-sql-server/ which provides a basis for troubleshooting the query at the database level.

crystal report & sub report

I am developing a payroll project in VB.NET with SQL Server, and I'm using Seagate Crystal Reports for reporting. If I use more than 10 subreports in a single report, will it affect my project's efficiency or take more time?
Yes, it will take more time, since you're in effect probably running 10 different queries and the reporting tool probably has to link the results of all of those queries.
I've written reports with 3 or 4 subreports, but usually more is unnecessary. I would try to think of a workaround for that many subreports - usually there's a way. (For example, use a column as a toggle for showing/hiding or grouping data.)
Actually it is hard to say how your subreports affect performance. I've designed reports where using subreports makes the entire report run faster - sometimes it is not easy to build an underlying query that is more efficient than many simpler queries for subreports.
One example is A-B-A-C type reports, where there are many one-to-many relations from a master table/query (A) to subtables/queries (B, C) AND users want to see all B and C data at once (not on demand). With a single query there would be A*B*C rows to process (plus nasty logic to show/hide sections); using subreports you deal with only A*(B+C) total rows to process and display (e.g. with 100 master rows and 10 rows each in B and C, that's 10,000 rows versus 2,000).
But when you use a subreport to display only some total value, it is often more efficient to aggregate it in the master view already - that takes less time both on the server and while transmitting data. Crystal Reports' formatting time is usually negligible compared to query execution time.
Like always, optimum strategy depends on particular report needs.