Crystal Reports - Several Group Levels - Improve Efficiency

I've had to make some modifications to one of our reports, which is composed of ~30 sub-reports, to change it from ~4 group levels to ~18 group levels, and it seems to have been hit with major processing degradation. I've added group suppression, as the only group levels I care about are: 1 (used to break the returned data into sections so the database doesn't have to be requeried multiple times, eliminating extra database hits and extra sub-reports), 6-11 (divisional/regional/etc. data), and 18 (the base level, by person).
The levels in between may be needed in the future, so they've been accounted for now; but since they've had such a negative impact on performance, I'd like to disable them. Right now they're suppressed, but they're obviously still being processed, which is killing performance.
How can I restore the efficiency without losing all the work I've done?

If a section containing a subreport is suppressed, the subreport shouldn't be executed, and hence its query shouldn't run.
From the Report menu, select Performance Information and you will get a breakdown of what is taking the time.
If this isn't self-explanatory, give us some more information and we should be able to help.

How to select columns which are needed to create a report in Tableau

I have Tableau Desktop. I am creating a report using 5 tables; 2 of the 5 tables are big. The tables are joined and a filter is applied. Extract creation is taking a long time (6-7 hours and still running). The big tables have 100+ columns, but I use only 12 columns to build my report.
Now, there is an option to use custom SQL, which takes less time to create the extract, but then I cannot use Tableau to its full potential.
Any suggestion is welcome. I am looking for a way to choose which columns are included when creating the extract.
Follow this process:
Make the database connection.
Join the tables.
Go to a sheet and take the fields required in the report, then right-click on the connection and create an extract. Don't forget to click "Hide unused fields", then apply the required filtering and create the extract.
This process should include only the required fields, out of all the fields, in the extract.
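If you do end up using the custom SQL option mentioned in the question, the main win is selecting only the columns the report actually uses and pushing the filter into the query. A minimal sketch (all table and column names here are hypothetical):

    -- select only the 12 columns the report needs, not all 100+
    SELECT o.order_id,
           o.order_date,
           o.amount,
           c.customer_name,
           c.region
    FROM   orders o
    JOIN   customers c ON c.customer_id = o.customer_id
    WHERE  o.order_date >= DATE '2023-01-01'  -- apply the filter before extraction

The narrower the row, the faster the extract builds; but as the question says, you give up some of Tableau's flexibility, so the "Hide unused fields" route above is usually the first thing to try.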
Especially for very large extracts, you can also consider the option to aggregate to visible dimensions when making an extract. That can dramatically reduce the size of the extract and the time to create and access it. But that option requires care, to be sure you use the faster extract in a way that still gets accurate results. There are assumptions built into that feature.
An extract is really a cached query result. If you perform aggregation when creating the extract, you can compute totals, mins, max, avg etc during extract creation, and then simply display the aggregate values in Tableau. This can save a lot of time. Of course, you can’t then further drill down past the level of detail in the extract in that case.
More importantly, if you perform further aggregation in Tableau, you have to be careful that the double aggregation gives the result you intend. Some functions are always safe: sums of sums, mins of mins, and maxes of maxes always give the same answer as if you had done one large aggregation operation. These are called additive operations. Other combinations may or may not give the result you intend: averages of averages, and definitely COUNTD of COUNTD, can be unexpected. Sometimes repeated aggregation is well defined, though; averages of daily sums can make sense, for example.
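To make the additive vs. non-additive distinction concrete, here is a small generic SQL illustration against a hypothetical sales table; the daily CTE plays the role of an aggregated extract:

    -- pre-aggregate to one row per day (what an aggregated extract stores)
    WITH daily AS (
        SELECT sale_date,
               SUM(amount) AS day_sum,
               AVG(amount) AS day_avg
        FROM   sales
        GROUP  BY sale_date
    )
    SELECT SUM(day_sum) AS grand_total,  -- additive: equals SUM(amount) over all raw rows
           AVG(day_avg) AS avg_of_avgs   -- weights each day equally, no matter how many
                                         -- sales it had: usually NOT equal to AVG(amount)
    FROM   daily;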
So performing aggregation during extract creation can lead to huge performance gains at visualization time: you effectively precompute much or all of the information you need to display. You just have to understand how it works and use it accordingly. Experiment.
By the way, that feature uses the default aggregation defined for each measure in the data source. Usually SUM(). You can change that in the data pane.

Calculating and reporting Data Completeness

I have been working on measuring data completeness and creating actionable reports for our HRIS system for some time.
Until now I have used Excel, but now that the reporting requirements have stabilized and the need for quicker response times has increased, I want to move the work to another level. At the same time, I would also like more detailed options for distinguishing between different units.
As an example I am looking at missing fields. So for each employee in every company I simply want to count how many fields are missing.
For other fields I am looking to validate data - like birthdays compared to hiring dates, thresholds for different values, employee groups compared to responsibility levels, and so on.
My question is where to move from here. Is there any language that is better than the others at importing lists, evaluating fields in the lists, and then quantifying the results at company and other levels? I want to be able to extract data from our different systems, then have a program do all the calculations and summarize the findings in some way. (I consider it a good learning experience.)
I've done something like this in the past and sort of cheated. I wrote a program that ran nightly, identified missing fields (not required, but necessary for data integrity), and dumped them to an incomplete-record table that was cleared each night before the process ran. I then sent batch emails listing the missing element(s) to the responsible group (Payroll/Benefits/Compensation/HR Admin) so the missing data could be added. I used .NET against an Oracle database and sent the emails via Lotus Notes, but a similar design should work in just about any environment.
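If it helps, the "count missing fields per employee" part of the question can be expressed directly in SQL before you even pick a programming language. A minimal sketch, assuming a hypothetical employees table and field names:

    SELECT company_id,
           employee_id,
           -- each CASE contributes 1 when the field is missing
             CASE WHEN birth_date IS NULL THEN 1 ELSE 0 END
           + CASE WHEN hire_date  IS NULL THEN 1 ELSE 0 END
           + CASE WHEN department IS NULL THEN 1 ELSE 0 END
           + CASE WHEN manager_id IS NULL THEN 1 ELSE 0 END AS missing_fields
    FROM   employees
    ORDER  BY company_id, missing_fields DESC;
    -- validation checks follow the same pattern, e.g.
    -- CASE WHEN hire_date < birth_date THEN 1 ELSE 0 END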

Out of memory exception for straightforward report

I'm trying to run an SSRS report. It's a straightforward report, just rendering data from a table which has around 80K records.
No aggregation or data processing is done in the report. There are around 50 columns along with 19 report parameters. I just have to display those 50 columns in the report (no pivot).
Usually it takes around 5 minutes to render this report on our development server (off-peak hours). The same is the case with our production server, but there users are getting "Out of memory" exceptions a lot, and report parameter criteria are not applied (those are the complaints I get from users).
I'm able to filter on the criteria locally without any problem, although it takes a long time to render.
Why does it take such a long time to render the report, even though the report is straightforward?
The report runs fine when I hit F5 in VS 2008, but from time to time I get out-of-memory exceptions when I hit the "Preview" tab.
Some of the column names have a "#" character. If I include such columns in the report, an "out of memory" exception is thrown (especially in Preview mode). Is there truth to this: does SSRS not like column names with "#"? E.g. my column name was "KLN#".
I have created a nonclustered index on the table, but that didn't help me much.
What's the difference between running the report in Preview mode vs. hitting F5 in VS 2008? It's fine when I hit F5, even though it takes 5 minutes, but Preview mode has the problem.
There isn't much room for redesign (since it's a straightforward report); at most I can remove some of the report parameters.
Any suggestion would be appreciated.
In addition to the already-posted answers, and regarding the problems with the preview in the Report Designer or Report Manager, there is another possible solution: avoid too much data on the first report page!
This can be done by paginating into small sets of records, e.g. via custom groups with page breaks (or sometimes automatically; see done_merson's answer), or by adding a simple cover page.
These solutions are especially helpful in the development phase, and if you plan to render the report results to Excel or PDF anyway.
I had a similar case, with out-of-memory exceptions and a never-returning report, for a simple report whose dataset contained about 70k records.
The query was executed in about 1-2 minutes, but neither the Report Designer nor our development SSRS 2008 R2 server (Report Manager) could show the resulting report preview. Finally I suspected the HTML preview of being the bottleneck and avoided it by adding a cover page with a simple textbox. The next report execution took about 2 minutes and successfully showed the HTML preview with the cover page. Rendering the complete result to Excel took only another 30 seconds.
Hopefully this will help others, since this page is still one of the top posts if you search for SSRS out of memory exceptions.
Why does it take such a long time to render...?
I have created a Nonclustered index on the table but that didn't help me much.
Because (AFAIK) SSRS constructs an in-memory model of the report before rendering. Know that SSRS takes three steps in creating a report:
Retrieve the data.
Create an internal model by combining the report and the data.
Render the report to the appropriate format (preview, HTML, XLS, etc.).
You can check the ExecutionLog2 view to see how much time each step takes. Step 1 is probably already reasonably fast (seconds), so the added index is not tackling the bottleneck. Probably steps 2 and 3 are taking a lot of time and require a lot of RAM.
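For example, in the ReportServer catalog database (that's the default name; yours may differ), a query along these lines shows which of the three steps dominates:

    SELECT TOP 20
           ReportPath,
           TimeDataRetrieval,  -- step 1: running the dataset queries (ms)
           TimeProcessing,     -- step 2: building the internal model (ms)
           TimeRendering,      -- step 3: rendering to the output format (ms)
           [RowCount],
           ByteCount
    FROM   ReportServer.dbo.ExecutionLog2
    ORDER  BY TimeStart DESC;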
SSRS doesn't like column names with "#"? My column name was KLN#.
As far as I know this shouldn't be a problem. More likely, removing that column was just enough to make the report runnable again.
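If you still want to rule the "#" out cheaply, one option is to alias the column in the dataset query rather than removing it (KLN# is the column name from the question; the table name is hypothetical):

    SELECT KLN# AS KLN   -- expose a #-free name to the report
    FROM   dbo.YourTable;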
There isn't much room for redesign (since it's a straightforward report); at most I can remove some of the report parameters.
SSRS is just not the right tool for this. As such, there is no real "solution" for your problem, only alternatives and workarounds.
Workarounds:
As @glh mentioned in his answer, making more RAM available for SSRS may "help".
Requiring the user to filter the data with a parameter (i.e. don't allow the user to select all those rows, only the ones he needs).
Schedule the report at a quiet moment (when there's enough RAM available) and cache the report.
Alternatives:
Create a small custom app that reads from the database and outputs an Excel file.
Use SSIS, which (I think) is better suited for this kind of task (data transformation and migration).
Rethink your setup. You haven't mentioned the context of your report, but perhaps you have an XY Problem. Perhaps your users want the entire report but only need a few key rows, or perhaps they only use it as a backup mechanism (for which there's better alternatives), or...
Try to increase your RAM; see this post for a similar error:
Need SSRS matrix to show more than 400k records
We just had a similar situation; setting the "Keep together on one page if possible" option (Tablix Properties / General / Page break options) to off made it work fine.

Crystal Reports & sub-reports

I am developing a payroll project in VB.NET with SQL Server, and I am using Seagate Crystal Reports for reporting. If I use more than 10 sub-reports in a single report, will it affect my project's efficiency or take more time?
Yes, it will take more time, since you're in effect probably running 10 different queries, and the reporting tool probably has to link the results of all of those queries.
I've written reports with 3 or 4 subreports, but usually more is unnecessary. I would try to think of a workaround for that many subreports - usually there's a way. (For example, use a column as a toggle for showing/hiding or grouping data.)
Actually, it is hard to say how your subreports affect performance. I've designed reports where using subreports makes the entire report run faster; sometimes it is not easy to build an underlying query that is more efficient than many simpler queries for subreports.
One example is A-B-A-C type reports, where there are many one-to-many relations from the master table/query (A) to subtables/queries (B, C) AND users want to see all the B and C data at once (not on demand). With a single query it would be A*B*C rows to process (plus nasty logic to show/hide sections); using subreports you deal with A*(B+C) total rows to process and display. For example, with 100 master rows and 10 related rows in each of B and C, that is 100*10*10 = 10,000 rows versus 100*(10+10) = 2,000.
But when you use a subreport to display only some total value, it is often more efficient to aggregate it in the master view already: that takes less time both on the server and while transmitting the data. Crystal Reports' formatting time is usually negligible compared to query execution time.
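As a hedged sketch of what "aggregate it already in the master view" could look like (hypothetical tables): the total is computed once in the master query, so the per-record subreport and its extra query disappear.

    SELECT e.employee_id,
           e.employee_name,
           t.total_hours          -- the figure a subreport would otherwise fetch
    FROM   employees e
    JOIN  (SELECT employee_id,
                  SUM(hours) AS total_hours
           FROM   timesheets
           GROUP  BY employee_id) t
          ON t.employee_id = e.employee_id;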
As always, the optimum strategy depends on the particular report's needs.

Performance efficiency of global temporary tables vs. normal tables - Oracle 10g

I am using a large number of global temporary tables to generate huge reports against an Oracle 10g database. Each report uses, say, 4 to 5 global temporary tables (GTTs). As far as I understand the concept of GTTs, the data is created on the fly per session, for different sets of parameters.
For example, in my scenario, 20 users generate the report for, say, the last month of sales data, which can lead to up to 1,000 executions in total per day. But if we assume that users query the most recent sales data more frequently, how can we use some cache memory to store the ranges of sales data that are queried more frequently, the way an internet browser does? Any other suggestions for fine-tuning the GTTs would also be very helpful.
It sounds like you are over-using GTTs. They are not normally needed very often in Oracle queries - in contrast to SQL Server, where (I have read) it is more common and appropriate to use temporary tables. Without knowing your requirements in detail it is hard to recommend an approach, but materialized views are one way of "caching" query results once and using them many times.
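As a minimal Oracle sketch of that idea, assuming a hypothetical sales table: the materialized view is built once and refreshed on a schedule, instead of every session repopulating a GTT.

    -- build the cached result once; refresh nightly instead of per session
    CREATE MATERIALIZED VIEW mv_monthly_sales
      BUILD IMMEDIATE
      REFRESH COMPLETE
      START WITH SYSDATE NEXT SYSDATE + 1   -- refresh once a day
    AS
    SELECT region_id,
           TRUNC(sale_date, 'MM') AS sale_month,
           SUM(amount)            AS total_amount
    FROM   sales
    GROUP  BY region_id, TRUNC(sale_date, 'MM');

    -- every report session now reads the precomputed rows
    SELECT * FROM mv_monthly_sales WHERE sale_month = TRUNC(SYSDATE, 'MM');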