CR2008: Cross-tab export to Excel repeating column headers - crystal-reports

I've exported a cross-tab to Excel (NOT data-only) and noticed that after every so many rows, it automatically repeats the column headers.
This makes it difficult to sort and filter. I don't understand why the column headers are being repeated, and there doesn't seem to be an option to disable this.
Anyone run into this issue before and managed to resolve it?
EDIT: it appears that Crystal is automatically adding page breaks to the export. One solution I've found is to dissociate the page size from the printing page size and then set the vertical length to be ridiculously large. Of course, the page size is still hardcoded, so with sufficiently large amounts of data the solution would still not work, but fortunately I have upper bounds on just how large the data set can be.

Did you set these Excel export options?
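If the export is driven from code rather than from the Designer, the same options are reachable through the SDK. Below is a rough Java sketch using the Crystal Reports Java (RAS) SDK: the open/export pattern follows SAP's samples, but ExcelExportFormatOptions and its setExportPageBreaksForEachPage setter are recalled from memory and may differ by SDK version, so treat those names as assumptions to verify against your javadocs.

    import java.io.InputStream;
    import com.crystaldecisions.reports.sdk.ReportClientDocument;
    import com.crystaldecisions.sdk.occa.report.exportoptions.ExcelExportFormatOptions;
    import com.crystaldecisions.sdk.occa.report.exportoptions.ExportOptions;
    import com.crystaldecisions.sdk.occa.report.exportoptions.ReportExportFormat;

    public class CrossTabExport {
        public static InputStream exportWithoutPageBreaks(String rptPath) throws Exception {
            ReportClientDocument doc = new ReportClientDocument();
            doc.open(rptPath, 0); // 0 = default open options

            ExportOptions options = new ExportOptions();
            options.setExportFormatType(ReportExportFormat.MSExcel);

            // Assumption: this options class/setter suppresses the per-page
            // breaks that cause the repeated column headers; verify the exact
            // names in your SDK version before relying on this.
            ExcelExportFormatOptions excelOptions = new ExcelExportFormatOptions();
            excelOptions.setExportPageBreaksForEachPage(false);
            options.setFormatOptions(excelOptions);

            InputStream result = doc.getPrintOutputController().export(options);
            doc.close();
            return result;
        }
    }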


How to remove words from a document on a column-by-column basis instead of whole lines in Word

Perhaps a stupid question, but I have a document with a large number of numerical values arranged in columns (although not in Word's actual column formatting), and I want to delete certain columns while leaving one intact. Here's a link to a part of my document:
Data
As can be seen, there are four columns, and I only want to keep the 3rd column; but when I select any of this in Word, it selects the whole line. Is there a way I can select data in Word as a column rather than as whole lines? If not, can this be done in other word-processing programs?
Generally, spreadsheet apps or subprograms are what you need for deleting and modifying data in column or row format.
Microsoft's spreadsheet equivalent is Excel, part of the Microsoft Office Suite that Word came with. I believe Google Docs has a free spreadsheet tool online as well.
I have not looked at the uploaded file, but if it is small enough, you might be able to paste one row of data at a time into a spreadsheet, and then do your operation on the column data all at once.
There may be other solutions to this problem, but that's a start.
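If a small script is an option, the same column extraction can be done outside any word processor entirely. Here is a minimal Java sketch, offered as a swapped-in alternative to the spreadsheet route above; it assumes the values are whitespace-separated and sit in a plain-text file (data.txt is a placeholder name):

    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class KeepThirdColumn {
        public static void main(String[] args) throws Exception {
            // Read each line, split on runs of whitespace, keep only the 3rd column.
            for (String line : Files.readAllLines(Paths.get("data.txt"))) {
                String[] cols = line.trim().split("\\s+");
                if (cols.length >= 3) {
                    System.out.println(cols[2]); // columns are 0-indexed
                }
            }
        }
    }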

How to handle selection with large amounts of data in NatTable

When using NatTable with a selection layer, if I have a huge number of columns (1 million+), selecting a row takes an extremely long time (20 seconds+) or outright crashes my application. Is there a better way to handle selection of large amounts of data, or maybe a way to select the entire range but only visually mark the currently visible columns as selected, updating that as the table is scrolled?
It turns out that this is really a performance leak in NatTable. Interestingly, it has existed in that form for a long time and nobody noticed it until now.
I created a ticket [1] and am working on a fix.
Until then, you could try to remove or replace the "bad guys" in your composition (a trimmed-down stack is sketched below). If that is not possible, you will need to wait for the fix.
ColumnReorderLayer: if you don't need column reorder support, remove it from your layer stack (when talking about millions of columns, I suppose reordering is not a required feature)
ColumnHideShowLayer: if you don't need to support hiding columns, remove it from your layer stack. Not sure if you need it for your use case of showing millions of columns.
SelectionModel: I don't know your data model, but maybe the PreserveSelectionModel performs slightly better at the moment. Or have a look at the proposed fix attached to the ticket (once it is uploaded) and use a local version of that fix in your environment by creating a custom ISelectionModel implementation based on the fix.
[1] https://bugs.eclipse.org/bugs/show_bug.cgi?id=509685
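For illustration, a minimal body layer stack without the two layers above might look like the following. The class names are standard NatTable API; the data provider is whatever you already use for your body data:

    import org.eclipse.nebula.widgets.nattable.NatTable;
    import org.eclipse.nebula.widgets.nattable.data.IDataProvider;
    import org.eclipse.nebula.widgets.nattable.layer.DataLayer;
    import org.eclipse.nebula.widgets.nattable.selection.SelectionLayer;
    import org.eclipse.nebula.widgets.nattable.viewport.ViewportLayer;
    import org.eclipse.swt.widgets.Composite;

    public class SlimBodyStack {
        public static NatTable create(Composite parent, IDataProvider bodyDataProvider) {
            DataLayer bodyDataLayer = new DataLayer(bodyDataProvider);
            // Note: no ColumnReorderLayer and no ColumnHideShowLayer in between,
            // so their per-column bookkeeping is taken out of the selection path.
            SelectionLayer selectionLayer = new SelectionLayer(bodyDataLayer);
            ViewportLayer viewportLayer = new ViewportLayer(selectionLayer);
            return new NatTable(parent, viewportLayer);
        }
    }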

How to automatically create a new worksheet when data exceeds 65536 rows when exporting using OfficeWriter?

I have a report that exceeds 65536 rows of data. From my understanding (correct me if I'm wrong here), an OfficeWriter template can only render that many rows (65536), and the remaining rows will be dropped. Is there any way to automatically add a new worksheet to the exported Excel file to accommodate the remaining rows using OfficeWriter?
Dave,
There are a couple of ways of doing this.
Use the continue modifier. The continue modifier lets you overflow your data from one worksheet to another; see the documentation here.
Use the XLSX file format, not XLS. The 65536-row limit you mention is a limitation of the XLS file format; XLSX easily supports over 1 million rows per worksheet (1,048,576, to be exact). See the sketch after this answer for an illustration of the rollover idea.
Lastly, look at the MaxRows property on DataBindingProperties. I am going from memory and do not have OfficeWriter installed at the moment, but depending on your version of OW there may be a bug: in some versions I believe MaxRows defaults to 65536, so even if you are using XLSX the output may appear to be truncated. You can work around this by setting MaxRows to a larger number and using XLSX.
Hope this helps.
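Regarding points 1 and 2, the row caps are properties of the file formats themselves, not of any particular library. As an illustration of rolling over to a new worksheet at the XLS limit, here is a sketch using the open-source Apache POI library (shown purely as a stand-in for demonstration, not OfficeWriter's API; the file name and row count are made up):

    import java.io.FileOutputStream;
    import org.apache.poi.ss.SpreadsheetVersion;
    import org.apache.poi.ss.usermodel.Row;
    import org.apache.poi.ss.usermodel.Sheet;
    import org.apache.poi.xssf.usermodel.XSSFWorkbook;

    public class SheetRollover {
        public static void main(String[] args) throws Exception {
            int maxRows = SpreadsheetVersion.EXCEL97.getMaxRows(); // 65536, the XLS cap
            int totalRows = 150_000; // made-up row count exceeding that cap

            try (XSSFWorkbook wb = new XSSFWorkbook()) {
                Sheet sheet = wb.createSheet("Data1");
                int rowInSheet = 0;
                int sheetCount = 1;
                for (int i = 0; i < totalRows; i++) {
                    if (rowInSheet == maxRows) { // roll over to a new sheet at the cap
                        sheet = wb.createSheet("Data" + (++sheetCount));
                        rowInSheet = 0;
                    }
                    Row row = sheet.createRow(rowInSheet++);
                    row.createCell(0).setCellValue("row " + i);
                }
                try (FileOutputStream out = new FileOutputStream("report.xlsx")) {
                    wb.write(out);
                }
            }
        }
    }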

What is the ideal approach to export reports to Excel and CSV using JasperReports?

I am using iReport Designer to export reports into PDF and CSV formats. For the PDF format everything seems perfect, but when I use the same design to export to CSV, the whole layout goes haywire. I will document all the research I have gathered below. Let's have a look at the report format in PDF and then in CSV.
[Screenshots: the report rendered in PDF format, then in CSV format]
Here is the research I gathered.
PDF is a pixel-perfect, layout-oriented format, whereas CSV is purely data-oriented.
We can use CSVMetaDataExporter to extract just the data, setting the column names, types, and data via export parameters, though I have not tried that option yet.
So my basic question is: if we want to use the same template to export to CSV or Excel, we will obviously run into alignment and width issues. I exported the report to Excel as well, and the results were not at all satisfactory. In this context, is JasperReports really the right choice for Excel and CSV output? If it is, what is the ideal approach to deal with such output formats?
In my professional opinion, no. Don't even bother trying to keep the same template when your output switches between visual formats (PDF, on-screen, print) and structured ones (CSV, Excel, etc.).
Alex K mentioned the Advanced Excel Features, and when used well they can generate output that matches what you see on screen. However, your element design must be very tight: avoid spanning cells and absolutely positioned elements, and snap elements to the grid or to each other.
If your client/user requires the report to look good and be usable in Excel, then you may very well have to design specifically for an Excel format.
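Whichever template strategy you settle on, the CSV export step itself is small. A minimal sketch with the fluent exporter API, assuming JasperReports 5.6 or later and an already-filled JasperPrint (the output path is a placeholder):

    import net.sf.jasperreports.engine.JRException;
    import net.sf.jasperreports.engine.JasperPrint;
    import net.sf.jasperreports.engine.export.JRCsvExporter;
    import net.sf.jasperreports.export.SimpleExporterInput;
    import net.sf.jasperreports.export.SimpleWriterExporterOutput;

    public class CsvExport {
        // Writes the filled report to CSV, ignoring most of the visual layout.
        public static void toCsv(JasperPrint print, String csvPath) throws JRException {
            JRCsvExporter exporter = new JRCsvExporter();
            exporter.setExporterInput(new SimpleExporterInput(print));
            exporter.setExporterOutput(new SimpleWriterExporterOutput(csvPath));
            exporter.exportReport();
        }
    }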

Out of memory exception for a straightforward report

I'm trying to run an SSRS report. It's a straightforward report that just renders data from a table with around 80K records.
No aggregation or data processing is done in the report. There are around 50 columns along with 19 report parameters. I just have to display those 50 columns in the report (no pivot).
Usually it takes around 5 minutes to render this report on our development server (during off-peak hours). The same is true on our production server, but there users frequently get "Out of memory" exceptions, and the report parameter criteria are not applied (those are the complaints I get from users).
I'm able to filter on the criteria locally without any problem, although it takes a long time to render.
Why does it take such a long time to render the report, even though the report is straightforward?
The report runs fine when I hit F5 in VS 2008, but from time to time I get out-of-memory exceptions when I hit the "Preview" tab.
Some of the column names contain a "#" character. If I include such columns in the report, an "out of memory" exception is thrown (especially in Preview mode). Is it true that SSRS doesn't like column names with "#"? E.g., my column name was "KLN#".
I have created a nonclustered index on the table but that didn't help me much.
What's the difference between running the report in Preview mode vs. hitting F5 in VS 2008? It's fine when I hit F5, even though it takes 5 minutes, but Preview mode has the problem.
There isn't much room for redesign (since it's a straightforward report); perhaps all I can do is remove some of the report parameters.
Any suggestion would be appreciated.
In addition to the already-posted answers, and regarding the problems with the preview in the Report Designer or Report Manager, there is another possible solution: avoid too much data on the first report page!
This can be done by paginating into small sets of records, e.g. with custom groups that force page breaks (or sometimes automatically; see done_merson's answer), or by adding a simple cover page.
These solutions are especially helpful during the development phase, and if you plan to render the report to Excel or PDF anyway.
I had a similar case, with out-of-memory exceptions and never-returning reports, for a simple report whose dataset contained about 70k records.
The query executed in about 1-2 minutes, but neither the Report Designer nor our development SSRS 2008 R2 server (Report Manager) could show the resulting report preview. Finally I suspected the HTML preview of being the bottleneck and avoided it by adding a cover page with a simple textbox. The next report execution took about 2 minutes and successfully showed the HTML preview with the cover page. Rendering the complete result to Excel only took another 30 seconds.
Hopefully this will help others, since this page is still one of the top posts if you search for SSRS out of memory exceptions.
Why does it take such a long time to render...?
I have created a Nonclustered index on the table but that didn't help me much.
Because (AFAIK) SSRS will construct an in-memory model of the report before rendering. Know that SSRS will take three steps in creating a report:
Retrieve the data.
Create an internal model by combining the report and the data.
Render the report to the appropriate format (preview, HTML, XLS, etc.)
You can check the ExecutionLog2 view to see how much time each step takes. Step 1 is probably already reasonably fast (seconds), so the added index is not tackling the bottleneck. Probably steps 2 and 3 are taking a lot of time and require a lot of RAM.
SSRS doesn't like column names with "#"? My column name was KLN#.
As far as I know this shouldn't be a problem. Removing that column more likely was just enough to make the report runnable again.
There isn't much room for redesign (since it's a straightforward report); perhaps all I can do is remove some of the report parameters.
SSRS is just not the right tool for this. As such, there is no real "solution" for your problem, only alternatives and workarounds.
Workarounds:
As @glh mentioned in his answer, making more RAM available for SSRS may "help".
Requiring the user to filter the data with a parameter (i.e. don't allow the user to select all those rows, only the ones he needs).
Schedule the report at a quiet moment (when there's enough RAM available) and cache the report.
Alternatives:
Create a small custom app that reads from the database and outputs an Excel file (see the sketch after this list).
Use SSIS, which (I thought) is better suited for this kind of task (data transformation and migration).
Rethink your setup. You haven't mentioned the context of your report, but perhaps you have an XY Problem. Perhaps your users want the entire report but only need a few key rows, or perhaps they only use it as a backup mechanism (for which there are better alternatives), or...
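For the custom-app alternative, here is a minimal sketch in Java using JDBC plus Apache POI's streaming SXSSF API; the connection string, table name, and file name are placeholders, and your own stack might equally be .NET:

    import java.io.FileOutputStream;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import org.apache.poi.ss.usermodel.Row;
    import org.apache.poi.ss.usermodel.Sheet;
    import org.apache.poi.xssf.streaming.SXSSFWorkbook;

    public class TableToExcel {
        public static void main(String[] args) throws Exception {
            // SXSSF keeps only a sliding window of rows in memory,
            // so 80k rows x 50 columns won't exhaust RAM.
            try (Connection con = DriverManager.getConnection(
                         "jdbc:sqlserver://host;databaseName=db;user=u;password=p"); // placeholder URL
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery("SELECT * FROM MyReportTable"); // hypothetical table
                 SXSSFWorkbook wb = new SXSSFWorkbook(100)) { // keep 100 rows in memory
                Sheet sheet = wb.createSheet("Report");
                int colCount = rs.getMetaData().getColumnCount();
                int r = 0;
                // Header row taken from the result-set metadata.
                Row header = sheet.createRow(r++);
                for (int c = 1; c <= colCount; c++) {
                    header.createCell(c - 1).setCellValue(rs.getMetaData().getColumnLabel(c));
                }
                while (rs.next()) {
                    Row row = sheet.createRow(r++);
                    for (int c = 1; c <= colCount; c++) {
                        row.createCell(c - 1).setCellValue(rs.getString(c));
                    }
                }
                try (FileOutputStream out = new FileOutputStream("report.xlsx")) {
                    wb.write(out);
                }
            }
        }
    }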
Try to increase your RAM; see this post for a similar error:
Need SSRS matrix to show more than 400k records
We just had a similar situation; we set the "Keep together on one page if possible" option (Tablix Properties / General / Page break options) to off, and it worked fine.