Problems exporting a 3305 page report (95000 records) using Crystal Reports 8 to RTF/Word/Excel - crystal-reports

I'm having problems exporting a 3305 page report (95000 records) using CR 8 to RTF.
When exporting a TXT file, it works.
But...
When exporting a large RTF, the program hangs at about 42% of the export process. Later it frees up the system, appears to finish, and outputs a file. The file itself is not complete (many records missing), and the formatting is gone (everything displays vertically, one word on top of another).
My setup is Windows XP SP2, an Intel Pentium 2.8 GHz CPU, and about 512 MB of RAM. On another machine with twice that amount it only got to 43%.
When exporting a large DOC, the Reports module hangs at about 63% of the export process. Later it frees up the system, and outputs a file. The file itself is in Word 2.0, and I cannot open it on my screen.
Excel 8 is also a no go
Upgrading CR is not an option for me at this point.
The customer wants this feature to work and is not presently willing to filter the report and export it in smaller chunks (the nature of their work requires a single document with a single date stamp at the bottom of the page, among other reasons).
It seems like it could be a memory issue.
I also wonder whether there are limits on the size of an RTF, Word, or Excel file. I think Excel is only good for 65,000 or so records per worksheet.
Any ideas?
P.S. - I had a look at the other suggested topics similar to this one, and did not find the answer I was looking for.
I also sent an email to Crystal Reports, but I think they're now owned by another company, and I wonder whether that company supports version 8. I thought I read elsewhere that they do not. Does anyone know who is still supporting version 8?

Excel (pre-2007), at least, does have a maximum row count, and I believe it's 65,536 rows (Excel 2007: 1,048,576 rows and 16,384 columns). There may be similar limitations with Word, but I would think that's unlikely, and that the limitations are a result of the exporting functionality in your version of Crystal...
Also, I'm pretty sure you're SOL with getting support from SAP (which owns CR) for version 8. In my travels working with Crystal Reports (from a distance), I've seen many issues with exporting from CR that have been (recently) corrected with updates to the ExportModeler library.
Good luck with finding some help with CR8; even though you'd mentioned upgrading CR is not an option, I think it'd be your only recourse... :(

Years ago I had a problem where the temp file that Crystal Reports generated for very large exports took up all the available space on the hard drive. Check how much space you have on your temp drive (usually C:). You can also watch the disk space as the export occurs to see if it is chewing up the space. It will magically stall (e.g. at 42% complete) when free space gets down to almost zero. After the process fails, the temp file is deleted and your disk space goes back to normal.

Related

How to clean files easily?

I have been a BI developer in Microsoft SQL Server for a while now.
I have always worked either with almost-clean data, meaning Excel files where the first row holds the headings and the sheets don't contain too much rubbish (like irrelevant data, calculations, and so on), or text files with data separated by commas (CSV files),
or with relatively small numbers of files which I cleaned manually (it wasn't an issue).
In my new job I am getting many unclean files, for example plain text files (not CSV) and Excel files that are the opposite of those mentioned above.
My problem is that there are many of these files, and going through every one is tedious (opening it, cleaning it manually, and trying to make sense of the data within) before I can finally load it into an integration tool (SSIS, Informatica) and then into a viz tool through a data warehouse.
A viz tool like Tableau Desktop can't clean them appropriately with its automatic interpretation (it picks up only the main tables and ignores the others in these unclean files).
I am sure someone has dealt with this before; your help would be appreciated!
How do you deal with these situations?

Best way to report the growth of a file using powershell?

I would like to report the database sizes to myself via email every week, compare them to the week before, and display the growth in megabytes and/or %.
I have everything done besides the comparison.
Imagine this setup :
SQL server with 100 databases
Now there are plenty of ways to do a comparison. I thought about writing the sizes into an XML file with PowerShell and later reading them out with a second script and reporting to myself.
Since I am self-taught in PowerShell I might have gaps here, so I am afraid of missing an easy way.
Does anyone have a nice idea of how to compare the sizes?
I will manage the report and the calculation myself later; I just need a good way to do the comparison.
Currently I am on PowerShell 3.0 but I can upgrade to 4.0.
Don't reinvent the wheel. SQL Server already has tools to monitor DB file sizes, and so does Performance Monitor. There are several third-party products available too. Ask your local DBA whether such a system is already in place.
A common practice is to query the server for DB file sizes on, say, a daily basis and store them in a utility DB table with a timestamp. Calculating change volumes, ratios, and whatnot can then be done on the T-SQL side (not that it is CPU-intensive anyway).
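A minimal T-SQL sketch of that idea, assuming a hypothetical dbo.DatabaseSizeHistory table in a utility database (the table and column names are placeholders, and the daily insert would run from a SQL Agent job):

-- Run once: one size sample per database per day.
CREATE TABLE dbo.DatabaseSizeHistory
(
    SampleDate   date          NOT NULL,
    DatabaseName sysname       NOT NULL,
    SizeMB       decimal(18,2) NOT NULL,
    CONSTRAINT PK_DatabaseSizeHistory PRIMARY KEY (SampleDate, DatabaseName)
);

-- Run daily: record the current size of every database (size is in 8 KB pages).
INSERT INTO dbo.DatabaseSizeHistory (SampleDate, DatabaseName, SizeMB)
SELECT CAST(GETDATE() AS date), DB_NAME(database_id), SUM(size) / 128.0
FROM sys.master_files
GROUP BY database_id;

-- Weekly report: compare today's sample with the one from seven days ago.
SELECT cur.DatabaseName,
       prev.SizeMB AS SizeLastWeekMB,
       cur.SizeMB  AS SizeNowMB,
       cur.SizeMB - prev.SizeMB AS GrowthMB,
       100.0 * (cur.SizeMB - prev.SizeMB) / NULLIF(prev.SizeMB, 0) AS GrowthPercent
FROM dbo.DatabaseSizeHistory AS cur
JOIN dbo.DatabaseSizeHistory AS prev
  ON prev.DatabaseName = cur.DatabaseName
 AND prev.SampleDate = DATEADD(day, -7, cur.SampleDate)
WHERE cur.SampleDate = CAST(GETDATE() AS date)
ORDER BY GrowthMB DESC;

From there the PowerShell side only has to run the last query and mail the result (e.g. with Invoke-Sqlcmd and Send-MailMessage), which keeps the comparison logic out of the script entirely.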
Alternatively, I would create a CSV file for each database and write out two columns, date and size:
Date,Size
27.08.2014,1024
28.08.2014,1040
29.08.2014,1080
Then you can import the CSV file, sort the rows by date, compare the last two sizes, and send the result by mail.

Out of memory exception for straightforward report

I'm trying to run an SSRS report. It's a straightforward report, just to render data from a table which has around 80K records.
No aggregation or data processing is done in the report. There are around 50 columns along with 19 report parameters. I just have to display those 50 columns in the report (no pivot).
Usually it takes around 5 minutes to render this report on our development server (off-peak hours). The same is true on our production server, but there users get a lot of "out of memory" exceptions, and the report parameter criteria are not applied (those are the complaints I get from users).
I'm able to filter by the criteria locally without any problem, although it takes a long time to render.
Why does it take such a long time to render the report, even though the report is straightforward?
The report runs fine when I hit F5 in VS 2008, but from time to time I get out-of-memory exceptions when I hit the "Preview" tab.
Some of the column names have a "#" character. If I include such columns in the report, an "out of memory" exception is thrown (especially in Preview mode). Is there any truth to the idea that SSRS doesn't like column names with "#"? E.g. my column name was "KLN#".
I have created a nonclustered index on the table but that didn't help me much.
What's the difference between running the report in Preview mode vs. hitting F5 in VS 2008? It's fine when I hit F5 even though it takes 5 minutes, but Preview mode has the problem.
There isn't much room for redesign (since it's a straightforward report); perhaps I can only remove some of the report parameters.
Any suggestion would be appreciated.
In addition to the already posted answers, and regarding the problems with the preview in the Report Designer or Report Manager, there is another possible solution: avoid having too much data on the first report page!
This can be done by paginating into small sets of records, i.e. with custom groups with page breaks, sometimes automatically (see the answer by done_merson), or by adding a simple cover page.
These solutions are especially helpful in the development phase and if you plan to render the report to Excel or PDF anyway.
I had a similar case, with out-of-memory exceptions and never-returning reports, for a simple report whose dataset contained about 70k records.
The query executed in about 1-2 minutes, but neither the Report Designer nor our development SSRS 2008 R2 server (Report Manager) could show the resulting report preview. Finally I suspected the HTML preview was the bottleneck and avoided it by adding a cover page with a simple textbox. The next report execution took about 2 minutes and successfully showed the HTML preview with the cover page. Rendering the complete result to Excel took only another 30 seconds.
Hopefully this will help others, since this page is still one of the top posts if you search for SSRS out-of-memory exceptions.
Why does it take such a long time to render...?
I have created a Nonclustered index on the table but that didn't help me much.
Because (AFAIK) SSRS will construct an in-memory model of the report before rendering. Know that SSRS will take three steps in creating a report:
Retrieve the data.
Create an internal model by combining the report and the data.
Render the report to the appropriate format (preview, HTML, XLS, etc.).
You can check the ExecutionLog2 view to see how much time each step takes. Step 1 is probably already reasonably fast (seconds), so the added index is not tackling the bottleneck. Probably steps 2 and 3 are taking a lot of time and require a lot of RAM.
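For example, a query along these lines against the report server catalog database (named ReportServer by default; adjust if yours differs) shows the milliseconds spent in each phase per execution:

-- Most recent executions, with time spent in each of the three steps (in ms).
SELECT TOP (20)
       ReportPath,
       Format,
       TimeStart,
       TimeDataRetrieval,  -- step 1: retrieve the data
       TimeProcessing,     -- step 2: build the internal model
       TimeRendering,      -- step 3: render to the requested format
       [RowCount]
FROM ReportServer.dbo.ExecutionLog2
ORDER BY TimeStart DESC;

If TimeProcessing and TimeRendering dominate, throwing more indexes at the source table will not help; the time is being spent inside SSRS itself.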
SSRS doesn't like column names with #?? my column name was KLN#.
As far as I know this shouldn't be a problem. Removing that column more likely was just enough to make the report runnable again.
There isn't much room for redesign (since it's a straightforward report); perhaps I can only remove some of the report parameters.
SSRS is just not the right tool for this. As such, there is no real "solution" for your problem, only alternatives and workarounds.
Workarounds:
As #glh mentioned in his answer, making more RAM available for SSRS may "help".
Requiring the user to filter the data with a parameter (i.e. don't allow the user to select all those rows, only the ones he needs).
Schedule the report at a quiet moment (when there's enough RAM available) and cache the report.
Alternatives:
Create a small custom app that reads from the database and outputs an Excel.
Use SSIS, which (I thought) is better suited for this kind of task (data transformation and migration).
Rethink your setup. You haven't mentioned the context of your report, but perhaps you have an XY Problem. Perhaps your users want the entire report but only need a few key rows, or perhaps they only use it as a backup mechanism (for which there's better alternatives), or...
Try to increase your RAM; see this post for a similar error:
Need SSRS matrix to show more than 400k records
We just had a similar situation and set the "Keep together on one page if possible" option in Tablix Properties / General / Page break options to off and it worked fine.

perl module for writing excel2007 workbook

I have a huge report coming out of a tool, from which I extract some important data and write an Excel file. Until now I have used the module Spreadsheet::WriteExcel, but it crashed when the number of rows exceeded 65535. Is there any other module which supports generating Excel files for huge amounts of data? I checked CPAN and found modules for reading Excel 2007 files, but couldn't find one for writing them. I am not writing a CSV because I want to generate multiple worksheets in the Excel file.
Excel::Writer::XLSX is an API compatible replacement for Spreadsheet::WriteExcel that supports the Excel 2007 xlsx format and the increased row/column limits.
If you check the bugs link on the Spreadsheet::WriteExcel CPAN site, you will notice there is a defect that has been open for more than 11 months for this exact issue.
Bug ID: 54902
That said, couldn't you instead try writing in chunks of 65536 rows per worksheet and collating them later?

tfs database size - version control

I have TFS installed on a single server and am running out of space on the disk. (We've been using the instance for about 2 years now.)
Looking at the tables in SQL Server, what seems to be the culprit is the tbl_content table; it is at 70 GB. If I do a get of the entire source tree for all projects, it is only about 8 GB of data.
Is this just all the histories of the files? That seems like a 10:1 ratio for just the histories, since I would think the deltas would be very small.
Does anyone know if that is a reasonable size given 8 GB of source (and 2 years of activity)? And if not, what should I look at to 'fix' it?
Thanks
I can't help with the ratio question at the moment, sorry. For a short-term fix you might check whether there is any space within the DB files that can be freed up. You may have already done so, but if not:
SELECT name, size/128.0 - CAST(FILEPROPERTY(name, 'SpaceUsed') AS int)/128.0 AS AvailableSpaceInMB
FROM sys.database_files;
If the statement above returns some space you want to recover, you can look into a one-time DBCC SHRINKDATABASE or DBCC SHRINKFILE, along with scheduling a routine SQL maintenance plan that may include defragmenting the database.
DBCC SHRINKDATABASE and DBCC SHRINKFILE aren't things you should do on a regular basis, because SQL Server needs some "swap" space to move things around for optimal performance. So neither should be relied upon as your long-term fix, and both could cause some noticeable degradation of TFS response times.
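For reference, the shrink itself is a one-liner; a minimal sketch using hypothetical names (check sys.database_files in your TFS database for the real logical file name) and an example target of 60000 MB:

USE TfsVersionControl;  -- hypothetical TFS database name; substitute yours
GO
-- Shrink the data file, leaving roughly 60000 MB allocated.
DBCC SHRINKFILE (N'TfsVersionControl_Data', 60000);
GO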
JB
Are you seeing data growth every day, even when no activity occurs on the system? If so, are you storing any binaries somewhere outside of the 8 GB of source?
The reason I ask is that if TFS is unable to calculate a delta, or if a file exceeds the size limit for delta generation, TFS will duplicate the entire binary file. I don't have the link with me, but I have it on my work machine; it describes this scenario and how to fix it, in case this is the cause of your problems.