I am developing a payroll project in VB.NET with SQL Server, and I am using Seagate Crystal Reports for reporting. If I use more than 10 subreports in a single report, will it affect my project's efficiency or make the report take more time?
Yes, it will take more time, since you're in effect running 10 different queries and the reporting tool probably has to link the results of all of those queries.
I've written reports with 3 or 4 subreports, but more than that is usually unnecessary. I would try to think of a workaround for that many subreports - usually there's a way. (For example, use a column as a toggle for showing/hiding or grouping data, as sketched below.)
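A hypothetical sketch of that idea - several per-category subreport queries collapsed into one query whose RecordType column drives the grouping or show/hide logic (table names are made up for illustration):

```sql
-- One query instead of several subreport queries: the RecordType column
-- becomes a report group (or a suppression toggle) in the main report.
SELECT 'Earnings'   AS RecordType, EmployeeID, Amount FROM dbo.Earnings
UNION ALL
SELECT 'Deductions' AS RecordType, EmployeeID, Amount FROM dbo.Deductions
UNION ALL
SELECT 'Taxes'      AS RecordType, EmployeeID, Amount FROM dbo.Taxes;
```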
Actually, it is hard to say how your subreports will affect performance. I've designed reports where using subreports made the entire report run faster - sometimes it is not easy to build a single underlying query that is more efficient than many simpler queries feeding subreports.
One example is A-B-A-C type reports, where there are many one-to-many relations from the master table/query (A) to subtables/queries (B, C) AND users want to see all B and C data at once (not on demand). A single query would have A*B*C rows to process (and would need nasty logic to show/hide sections); using subreports, you deal with A*(B+C) total rows to process and display.
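As a hypothetical illustration: with 1,000 master rows (A) and an average of 5 related rows each in B and C, the single joined query processes 1,000 × 5 × 5 = 25,000 rows, while the subreport approach processes only 1,000 × (5 + 5) = 10,000.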
But when you use a subreport to display only some total value, it is often more efficient to aggregate it in the master view instead - that takes less time both on the server and while transmitting the data. Crystal Reports' formatting time is usually negligible compared to query execution time.
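A hypothetical sketch of that idea - the total that would otherwise come from a subreport is computed server-side in the master query (table and column names are made up):

```sql
-- Aggregate in the master query instead of a totals-only subreport:
-- one row per employee comes back, not every detail row.
SELECT
    e.EmployeeID,
    e.EmployeeName,
    (SELECT SUM(d.Amount)
     FROM dbo.Deductions AS d
     WHERE d.EmployeeID = e.EmployeeID) AS TotalDeductions
FROM dbo.Employee AS e;
```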
As always, the optimum strategy depends on the particular report's needs.
I am working on an SSRS report, and some column values need to be concatenated for display in the report. Is it advisable to do that at the report end, or should I do it in the SQL query and bind the resulting value directly to the report?
I have 4 columns that I have to concatenate into a single column when binding it to the report.
There are three different ways to do that:
Do it in the SQL query to get a combined column.
Create an expression while binding the dataset to the tablix.
Create a calculated field in the dataset and bind that to the tablix.
Of the above three, which one is advisable for better performance?
This question is very broad but let me put it this way.
If you put business rules in the database, they can be consistently reused by many things beyond SSRS - for example, Excel, Power BI, and data extracts.
The downside is that it is often more technically difficult to apply rules consistently at a lower level like this. In other words, you need a SQL developer to do this properly, whereas if you did the calculation in SSRS you would just need an SSRS developer.
So if you have a team full of SSRS developers, then it's going to be easier to create and maintain rules in SSRS, but the downside is these rules can't be reused by anything else.
Short answer: do it in a view in the database unless this is going to be difficult to maintain because your team doesn't have any SQL skills.
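A minimal sketch of the view approach, assuming hypothetical column names for the four parts being combined:

```sql
-- Put the concatenation rule in a view so SSRS, Excel, Power BI, etc.
-- can all reuse the same definition.
CREATE VIEW dbo.vEmployeeReport
AS
SELECT
    EmployeeID,
    -- CONCAT (SQL Server 2012+) treats NULLs as empty strings;
    -- on older versions use ISNULL(col, '') with the + operator.
    CONCAT(FirstName, ' ', MiddleName, ' ', LastName, ' ', Suffix) AS FullName
FROM dbo.Employee;
```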
I have many Crystal Reports against the same database. Given the same date parameters, some execute quickly; many of the fields are the same, as are the tables they access. One of my reports that used to run quickly is now running very slowly, and I can see it looking through all the records - shown at the bottom as "0 of 100000" until it finds records. I have no idea what I may have changed to make it do this. Some reports still run fast and some do not, and these findings are consistent for the reports I am talking about. Does anyone know what setting might be causing this?
I have tried looking for any subtle differences in them - I cannot see anything. Many of them were clones of the original (which still works fast).
The performance section of my Crystal Reports book states that if the WHERE clause cannot be translated it will be ignored and all records will be processed - which is what this looks like - though I have a valid WHERE clause when I check it in the report.
"Use Indexes Or Server For Speed" is checked. All other settings in Report Options are identical.
Thanks
You can do some troubleshooting:
Try running your query directly against the database and see how long it takes (see the sketch below).
Is there any business logic added in your report?
Maybe also try putting the same query in a fresh report and see if it takes a similar time.
Also try debugging your application to see if some part of your code is making the report slow to display.
Are you running it against a local database or on a server?
Also, if you can share your query, I can take a look.
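For the first step, a minimal way to time the query server-side in SSMS (the table name and filter are placeholders for the report's actual query):

```sql
SET STATISTICS TIME ON;   -- report parse/compile and execution times
SET STATISTICS IO ON;     -- report logical/physical reads per table

SELECT *                  -- hypothetical: paste the report's real query here
FROM dbo.PayrollDetail
WHERE PayDate >= '2013-01-01';

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;
```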
Let me know if you need more help.
On hand is a requirement for a report that needs to perform a substring operation and group on the resulting substrings in a column. For example, consider my over-simplified scenario:
Among others, I have a column called FileName, which may have values like this
NWSTMT201308201230_STMTA
NWSTMT201308201230_STMTB
NWSTMT201308201230_STMTC
etc.
The report I'm working on should do the grouping on the values before the _ sign.
Assuming the volume of data is large, where is the best place to do the substring and grouping - in the stored procedure, or should I return the raw data and do all the work in SSRS? The expectation is good performance and maintainability.
As you mention, there are a few different possibilities. There's no correct answer for this, but certainly each method has advantages and disadvantages.
My take on the options:
On the SQL Server: as a computed column in a view (see the sketch after this list).
Pro: Easy to reuse if the query will be used by multiple reports or other queries.
Con: Very poor language for string manipulation.
On the SQL Server: Query embedded in the report, calculation still in query. Similar to 1, but now you lose the advantage of reuse.
Pro: report is very portable: changes can be tested against production data without disturbing current production reports.
Con: Same as 1, string manipulation in SQL is no fun. Less centralized, so possibly harder to maintain.
In the report, in formulas where required. Many disadvantages to this method, but one advantage:
Pro: It's easy to write.
Con: Maintenance is very difficult; finding all occurrences of a formula can be a pain. Limited to VBScript-like commands. The editor in the SSRS authoring environment is no fun, and lacks many basic code-editing features.
In the report, in the centralized code for the report.
Pro: VB.NET syntax, global variables, easy maintenance, with centralized code per report.
Con: VB.NET syntax (I greatly prefer C#). The editor is no better than the formula windows; you'll probably still end up writing this in another window and cutting and pasting it into its destination.
Custom .NET assembly: compiled as a .dll and called from the report.
Pro: Use any .NET language, full Visual Studio editor support, along with easy source control and centralization of code.
Con: More finicky to get set up; report deployment will require a .dll deployed to the SSRS server.
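For reference, a minimal sketch of methods 1/2 applied to the question's FileName example (the table name is hypothetical; NULLIF guards values that contain no underscore):

```sql
-- Group on the portion of FileName before the first underscore,
-- e.g. 'NWSTMT201308201230_STMTA' -> 'NWSTMT201308201230'.
SELECT
    LEFT(FileName,
         ISNULL(NULLIF(CHARINDEX('_', FileName), 0) - 1, LEN(FileName))) AS FileGroup,
    COUNT(*) AS FileCount
FROM dbo.StatementFiles
GROUP BY
    LEFT(FileName,
         ISNULL(NULLIF(CHARINDEX('_', FileName), 0) - 1, LEN(FileName)));
```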
So my decision process for this is something like:
Is this just a one-time, easy formula? Use method 3.
Is this cleanly expressed in SQL and only used in one report? Method 2.
Cleanly expressed in SQL and used in multiple reports or queries? Method 1.
Better expressed in Visual Basic than SQL? Method 4.
Significant development effort going into this with multiple developers? Method 5.
Too often I'll start with method 3 and then realize I've used the formula in too many places and should have centralized earlier. Also, our team is pretty familiar with SQL, so that pushes us toward the first two options more than some shops might be.
I'd put performance concerns second unless you know that you have a problem. Putting this code in SQL can sometimes pay off, but if you aren't careful, you can end up calling things excessively on results that are ultimately filtered out.
I'm trying to run an SSRS report. It's a straightforward report, just to render data from a table which has around 80K records.
No aggregation or data processing is done in report. There are around 50 columns along with 19 report parameters. I just have to display those 50 columns in report (no pivot).
Usually it takes around 5 minutes to render this report on our development server (off-peak hours). The same is the case with our production server, but there users are getting "Out of memory" exceptions a lot, and report parameter criteria are not being applied (those are the complaints I get from users).
I'm able to filter the criteria locally without any problem, although it takes a long time to render.
Why does it take such a long time to render the report, even though the report is straightforward?
The report runs fine when I hit F5 on VS 2008 but from time to time I get out of memory exceptions when I hit the "Preview" tab.
Some of the column names have a "#" character. If I include such columns in the report, an "out of memory" exception is thrown (especially in Preview mode). Is there truth to this - does SSRS dislike column names with "#"? E.g. my column name was "KLN#".
I have created a nonclustered index on the table but that didn't help me much.
What's the difference between running the report in Preview mode vs. hitting F5 in VS 2008? It's fine when I hit F5, even though it takes 5 minutes, but Preview mode has the problem.
There isn't much room for redesign (since it's a straightforward report); perhaps I can only remove some of the report parameters.
Any suggestion would be appreciated.
In addition to the already-posted answers, and regarding the problems with the preview in the Report Designer or Report Manager, there is another possible solution: avoid too much data on the first report page!
This can be done by paginating into small record sets, i.e. by custom groups with page breaks or sometimes automatically (see done_merson's answer), or by adding a simple cover page.
These solutions are especially helpful in the development phase, and if you plan to render the report results to Excel or PDF anyway.
I had a similar case of out-of-memory exceptions and a never-returning report: a simple report whose dataset contained about 70k records.
The query executed in about 1-2 minutes, but neither the Report Designer nor our development SSRS 2008 R2 server (Report Manager) could show the resulting report preview. Finally I suspected the HTML preview was the bottleneck, and avoided it by adding a cover page with a simple textbox. The next report execution took about 2 minutes and successfully showed the HTML preview with the cover page. Rendering the complete result to Excel took only another 30 seconds.
Hopefully this will help others, since this page is still one of the top posts if you search for SSRS out of memory exceptions.
Why does it take such a long time to render...?
I have created a Nonclustered index on the table but that didn't help me much.
Because (AFAIK) SSRS will construct an in-memory model of the report before rendering. Know that SSRS will take three steps in creating a report:
Retrieve the data.
Create an internal model by combining the report and the data.
Render the report to the appropriate format (preview, html, xls, etc)
You can check the ExecutionLog2 view to see how much time each step takes. Step 1 is probably already reasonably fast (seconds), so the added index is not tackling the bottleneck. Probably steps 2 and 3 are taking a lot of time, and require a lot of RAM.
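A minimal sketch for checking that, assuming the default name for the report server catalog database:

```sql
-- Per-phase timings (in milliseconds) for recent report executions.
SELECT TOP 10
    ReportPath,
    TimeDataRetrieval,   -- step 1: running the dataset queries
    TimeProcessing,      -- step 2: building the internal report model
    TimeRendering,       -- step 3: rendering to the output format
    [Status]
FROM ReportServer.dbo.ExecutionLog2
ORDER BY TimeStart DESC;
```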
SSRS doesn't like column names with "#"? My column name was KLN#.
As far as I know this shouldn't be a problem. Removing that column more likely was just enough to make the report runnable again.
There isn't much to redesign (since it's a straightforward report); perhaps I can only remove some of the report parameters.
SSRS is just not the right tool for this. As such, there is no real "solution" for your problem, only alternatives and workarounds.
Workarounds:
As @glh mentioned in his answer, making more RAM available for SSRS may "help".
Requiring the user to filter the data with a parameter (i.e. don't allow the user to select all those rows, only the ones they need) - see the sketch after this list.
Schedule the report at a quiet moment (when there's enough RAM available) and cache the report.
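A minimal sketch of the parameter-filter idea, with hypothetical table, column, and parameter names - the point is that the dataset query itself limits the rows instead of returning everything:

```sql
-- Dataset query filtered by report parameters instead of returning all rows.
-- @StartDate, @EndDate and @Department are hypothetical SSRS report parameters.
SELECT *
FROM dbo.PayrollDetail
WHERE PayDate >= @StartDate
  AND PayDate <  @EndDate
  AND (@Department IS NULL OR Department = @Department);
```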
Alternatives:
Create a small custom app that reads from the database and outputs an Excel.
Use SSIS, which (I thought) is better suited for this kind of task (data transformation and migration).
Rethink your setup. You haven't mentioned the context of your report, but perhaps you have an XY Problem. Perhaps your users want the entire report but only need a few key rows, or perhaps they only use it as a backup mechanism (for which there's better alternatives), or...
Try to increase your RAM; see this post for a similar error:
Need SSRS matrix to show more than 400k records
We just had a similar situation and turned off the "Keep together on one page if possible" option (Tablix Properties / General / Page break options), and it worked fine.
I've had to modify one of our reports, which is comprised of ~30 subreports, to change it from ~4 group levels to ~18 group levels. It seems to have been hit with major processing degradation. I've added group suppression, as the only group levels I care about are: 1 (used to break the returned data into sections so the database doesn't have to be re-queried multiple times, eliminating extra database hits and extra subreports), 6-11 (divisional/regional/etc. data), and 18 (base level, by person).
The levels in between may be needed in the future, so they've been accounted for now, but since they've had such a negative impact on performance I'd like to disable them. Right now they're suppressed, but they're obviously still being processed, which is killing the performance.
How can I restore the efficiency without losing all the work I've done?
If a section containing a subreport is suppressed, the subreport shouldn't be executed, and hence its query shouldn't run.
From the Report menu, select Performance Information and you will get a breakdown of what is taking the time.
If this isn't self-explanatory, give us some more information and we should be able to help.