Out of memory exception in SoftArtisans ExcelWriter (OfficeWriter)

I am using SoftArtisans ExcelWriter to populate an Excel template via data binding. When my data source (a data reader) exceeds about 120,000 rows I get an "Out of Memory" exception from ExcelWriter. Originally I was using a DataTable as the source but switched to a reader in the belief that it might have a smaller memory footprint. This does not happen all the time, but often enough to be annoying. The data does have some varchar(max) columns in it, but I have restricted all columns to 20,000 characters. I am using the latest version (as of 11 Feb 2014). Does anyone know of any tricks to minimise the load on ExcelWriter? Thanks
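Roughly the shape of what I'm doing, as a minimal sketch (the ExcelTemplate / CreateDataBindingProperties / BindData names are written from memory of the OfficeWriter API, and the query, connection string, file names and data-source name are placeholders, not my real ones):

    // Sketch only: bind an open data reader to an ExcelWriter template,
    // truncating the varchar(max) columns in SQL so no cell exceeds
    // 20,000 characters. API names are from memory; verify against the docs.
    using System.Data;
    using System.Data.SqlClient;
    using SoftArtisans.OfficeWriter.ExcelWriter;

    class ReportExport
    {
        static void Main()
        {
            const string sql =
                "SELECT Id, CreatedOn, LEFT(Notes, 20000) AS Notes FROM dbo.BigReport";

            using (var conn = new SqlConnection("Server=.;Database=Reports;Integrated Security=true"))
            using (var cmd = new SqlCommand(sql, conn))
            {
                conn.Open();
                using (IDataReader reader = cmd.ExecuteReader())
                {
                    var template = new ExcelTemplate();
                    template.Open("ReportTemplate.xlsx");

                    // Bind the open reader instead of a DataTable so rows are
                    // streamed from SQL rather than held twice in memory.
                    DataBindingProperties props = template.CreateDataBindingProperties();
                    template.BindData(reader, "BigReport", props);

                    template.Process();
                    template.Save("Report.xlsx");
                }
            }
        }
    }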

Related

Excel template translator JETT: more data, auto-generated sheet

Using the JETT template translator, if there is more data than one sheet supports, will the extra data be displayed in the next sheet? If yes, will it include the table header?
No, it won't.
Since you're mentioning a maximum-rows-per-sheet limit, I assume you're working with the XLS format, where every sheet is limited to 65,536 rows.
Then I would strongly advise you to switch to XLSX format, since its limit is around 1 million rows per sheet.
Also, if you're inserting that much data per sheet, you should consider streaming the data directly to the sheet using the POI SXSSF API. If you manipulate huge XLSX spreadsheets with hundreds of thousands of rows directly with JETT, you're likely to hit memory problems, but because of POI XSSF, not because of JETT itself.
If you're stuck with XLS format, you'll have to do the pagination and insert extra sheets yourself before inserting data with JETT.

How to export a CSV to Excel using PowerShell with more than 65536 rows (and 48 columns)

I'm using the code found here - How to export a CSV to Excel using PowerShell
However, when I try to convert a file with, say, 302,123 rows (and 48 columns), it only converts up to 65,536 rows. Is there a way to convert the whole file? I read it's an Excel 2003 issue. I'm not sure how to get it to convert fully with Excel 2007 and up...
It sounds like you are reaching the memory limit (a theoretical 2 GB virtual memory limit) for Excel 2007, which is a 32-bit process.
File size is not the same as memory usage, since Excel has to convert the data into rows. For example, I just converted a 650k-line CSV file (15 MB) using the linked answer in your post, and my 64-bit Excel 2013 used at least 70-80 MB of memory before it was done. Imagine your 350+ MB CSV file.
If you really need to convert that many rows to an Excel document, you should get a computer with 64-bit Office 2010 or newer.
Personally, I would recommend a SQL database for this amount of data.

MongoDB collection size before/after dump

I have a question regarding MongoDB's collection size.
I did a small stress test in which my MongoDB server was constantly inserting, deleting and updating data for about 48 hours. The documents were only of small size, simply a numerical value and a timestamp as well as an ID.
Now, after those 48 hours, the collection used for inserting, deleting and updating data was 98,000 bytes and the preallocated storage size was 696,320 bytes. It became that much higher than the actual collection size because of one input spike during an insertion phase. Due to subsequent deletions of objects the actual collection size decreased again, but the preallocated storage size didn't (AFAIK a common database management problem, since it's the same with e.g. MySQL).
After the stress test was completed I created a dump of my MongoDB database and dropped the database completely, so I could import the dump afterwards and see how the stats looked then. And as I suspected, the collection size was still the same (98,000 bytes) but the preallocated storage size went down to 40,960 bytes (from 696,320 bytes before).
Since we want to try out MongoDB for an application that produces hundreds of MB of data, and therefore I/O traffic, every day, we need to keep the database and its occupied space to a minimum, preferably without having to create a dump, drop the whole database and import the dump again every now and then.
Now my question is: is there a way to call the MongoDB garbage collector programmatically from code? The software behind it is Java, and my idea was to call the garbage collector after a certain amount of time/operations, or after the preallocated storage size has reached a certain threshold.
Or maybe there's an even better (more elegant) way to minimize the occupied space?
Any help would be appreciated and I'll try to provide any further information if needed. Thanks in advance.

How to automatically create new worksheet when data exceeds 65536 rows when exporting using OfficeWriter?

I have a report that exceeds 65,536 rows of data. From my understanding (correct me if I'm wrong here), an OfficeWriter template can only render this many rows (65,536), and the remaining rows will be removed. Is there any way to automatically add a new worksheet to the exported Excel file to accommodate the remaining rows using OfficeWriter?
Dave,
There are a couple ways of doing this.
Use the continue modifier. The continue modifier will let you overflow your data from one worksheet to another; see the documentation here.
Use the XLSX file format, not XLS. The 65,536-row limit you mention is a limitation of the XLS file format; the XLSX file format easily supports over 1 million rows per worksheet.
Lastly, look at the MaxRows property on DataBindingProperties. I am going by memory and do not have OfficeWriter installed at the moment, but depending on your version of OW there may be a bug: in some versions I believe MaxRows defaults to 65,536, so even if you are using XLSX the output may appear to get truncated. You can work around this by setting MaxRows to a larger number and using XLSX.
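As a rough sketch of that last workaround (class and method names from memory, so treat ExcelTemplate, CreateDataBindingProperties and BindData as assumptions to check against the docs for your OfficeWriter version; the template path and data-source name are examples):

    using System.Data;
    using SoftArtisans.OfficeWriter.ExcelWriter;

    static class ReportExporter
    {
        // Export every row the reader produces into an XLSX template,
        // lifting MaxRows above the old XLS ceiling of 65,536.
        public static void ExportAllRows(IDataReader reader)
        {
            var template = new ExcelTemplate();
            template.Open("ReportTemplate.xlsx");     // XLSX template, not XLS

            DataBindingProperties props = template.CreateDataBindingProperties();
            props.MaxRows = 1048576;                  // assumed settable; XLSX worksheet row limit

            template.BindData(reader, "Report", props);
            template.Process();
            template.Save("Report.xlsx");
        }
    }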
Hope this helps.

Problems exporting a 3305 page report (95000 records) using Crystal Reports 8 to RTF/Word/Excel

I'm having problems exporting a 3305 page report (95000 records) using CR 8 to RTF.
When exporting a TXT file, it works.
But...
When exporting a large RTF, the program hangs at about 42% of the export process. Later it frees up the system, appears to finish, and outputs a file. The file itself is not complete (many records missing), and the formatting is gone (everything displays vertically, one word on top of another).
My setup has Windows XP SP2, an Intel Pentium 2.8 GHz CPU and about 512 MB of RAM. On another machine with twice that amount it only got to 43%.
When exporting a large DOC, the Reports module hangs at about 63% of the export process. Later it frees up the system, and outputs a file. The file itself is in Word 2.0, and I cannot open it on my screen.
Excel 8 is also a no-go.
Upgrading CR is not an option for me at this point.
The customer wants this feature to work, and is not presently willing to filter the report and export in smaller chunks (the nature of their work requires them to have it as one single document with a single date stamp at the bottom of the page, among other reasons).
It seems like it could be a memory issue.
I also wonder if there aren't limits to the size of an RTF, Word or Excel file. I think Excel is only good for around 65,000 records per worksheet.
Any ideas?
P.S. - I had a look at the other suggested topics similar to this, and did not find the answer I was looking for.
I also sent an email to Crystal Reports, but I think they're now owned by another company, and I wonder whether they still support version 8. I thought I read elsewhere that they do not. Does anyone know who is still supporting version 8?
Excel (pre-2007), at least, does have a maximum row count, and I think it's 65,536 rows (Excel 2007: 1,048,576 rows and 16,384 columns). There may be similar limitations with Word, but I would think that's unlikely, and that the limitations are a result of the exporting functionality in your version of Crystal...
Also, I'm pretty sure you're SOL with getting support from SAP (which owns CR) for version 8. In my travels working with Crystal Reports (from a distance), I've seen many issues with exporting from CR that have been (recently) corrected with updates to the ExportModeler library.
Good luck with finding some help with CR8; even though you'd mentioned upgrading CR is not an option, I think it'd be your only recourse... :(
Years ago I had a problem where the temp file that Crystal Reports was generating for very large exports took up all the available space on the hard drive. Check to see how much space you have on your temp drive (usually C:). You can also watch the disk space as the export occurs to see if it is chewing up the space. It will magically stall (e.g. at 42% complete) when it gets down to almost zero. After the process fails, the temp file is deleted and your disk space goes back to normal.
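If you want to watch that without staring at Explorer, here is a small, Crystal-agnostic sketch that simply polls free space on whatever drive holds the temp folder while the export runs (the 5-second interval is arbitrary):

    using System;
    using System.IO;
    using System.Threading;

    class TempSpaceWatch
    {
        // Print free space on the drive that holds %TEMP% every few seconds,
        // so you can see whether it drops toward zero as the export "stalls".
        static void Main()
        {
            string tempRoot = Path.GetPathRoot(Path.GetTempPath());   // e.g. "C:\"
            var drive = new DriveInfo(tempRoot);

            while (true)
            {
                long freeMb = drive.AvailableFreeSpace / (1024 * 1024);
                Console.WriteLine("{0:T}  {1}  {2} MB free", DateTime.Now, drive.Name, freeMb);
                Thread.Sleep(5000);
            }
        }
    }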