Can Doxygen/Graphviz generate a PNG larger than 32766px? - doxygen

I am trying to generate doxygen documentation for a package with one enormous class hierarchy (it's for QuickFIX, FWIW). No matter what I do, the height of the image seems to be capped at 32766:
$ file html/inherit__graph__23.png
html/inherit__graph__23.png: PNG image data, 307 x 32766, 8-bit/color RGBA, non-interlaced
It's not clipping; it's scaling. The result is that at only 307px wide, the class boxes are scaled so small that the text inside them is not readable, and the HTML map doesn't work, either.
Neither the dot nor the doxygen documentation mentions this limit, though clearly something is imposing it, and I can't find any directive to override it. (And yes, I realize an image that big has its own problems in browsers, but I'll deal with that later.) That number seems suspicious due to its proximity to 2^15, and I believe PNG uses a 32-bit size field, so something bigger should be possible.
Anyone know where that limit is coming from and how to bypass it?
Edited to add: doxygen version = 1.6.1, graphviz = 2.26.0. Maybe too old?

Looks like I'm screwed. The 32K limit is imposed by cairo, which is what graphviz uses under the hood to render PNG.
Reference: http://comments.gmane.org/gmane.comp.lib.cairo/21068
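To confirm that the ceiling is the renderer's and not PNG's: the PNG IHDR chunk stores width and height as 4-byte big-endian integers, and the spec allows values up to 2^31-1. A minimal sketch in Python (not part of the original answer) that builds just a PNG signature plus IHDR and parses the dimensions back out:

```python
import struct
import zlib

def png_dimensions(data: bytes):
    """Parse width/height from a PNG byte stream's IHDR chunk."""
    assert data[:8] == b"\x89PNG\r\n\x1a\n", "not a PNG"
    # First chunk must be IHDR: 4-byte length, 4-byte type, then chunk data.
    length, ctype = struct.unpack(">I4s", data[8:16])
    assert ctype == b"IHDR"
    width, height = struct.unpack(">II", data[16:24])
    return width, height

def make_png_header(width, height):
    """Build only the signature + IHDR chunk, enough to illustrate the field sizes."""
    ihdr = struct.pack(">II5B", width, height, 8, 6, 0, 0, 0)  # 8-bit RGBA
    chunk = struct.pack(">I", len(ihdr)) + b"IHDR" + ihdr
    chunk += struct.pack(">I", zlib.crc32(b"IHDR" + ihdr))
    return b"\x89PNG\r\n\x1a\n" + chunk

print(png_dimensions(make_png_header(307, 32766)))  # (307, 32766)
# The format itself happily encodes dimensions far beyond 32766:
print(png_dimensions(make_png_header(307, 2**31 - 1)))
```

So the 32766 cap observed by `file` is baked in by cairo at render time, not by the container format.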

Unfortunately, you're correct. It's too old; you're not going to be able to exceed that limit unless you manage to upgrade to a newer version.

How to prevent automatic scaling under 96 dpi

In NatTable version 2, autoscaling was added while tables are being created, supported by default DPI converters: DefaultHorizontalDpiConverter and DefaultVerticalDpiConverter. In version 1, everything under 96 DPI was not scaled down; in version 2, however, NatTables are scaled down for lower DPIs, so images look ugly while fonts are OK:
72dpi - not ok:
96dpi - ok:
What would be the simplest way to prevent default scaling under 96 DPI?
The feature added with NatTable 2.0 is complete dynamic scaling and full support for all DPI values. The following blog post gives more details: NatTable – dynamic scaling enhancements
Actually, I wonder what is "ugly" about lower DPIs. At least on modern displays it should not be an issue. The only thing I can think of is the images, but typically downscaling doesn't make images ugly. So it would be really interesting to know what the issue is.
You have two options to handle that:
As you can scale NatTable dynamically at runtime, you can simply execute the ConfigureScalingCommand to force the scaling you want. Note that this needs to be done AFTER NatTable#configure().
if (Display.getDefault().getDPI().x < 96) {
    natTable.doCommand(
            new ConfigureScalingCommand(new FixedScalingDpiConverter(96)));
}
If you even want to block lower scalings in the dynamic scaling, you can implement a custom ConfigureScalingCommandHandler that checks the dpiFactor in the IDpiConverter and, if that is lower than 1, registers a FixedScalingDpiConverter on the SizeConfigs. That custom ConfigureScalingCommandHandler then needs to be registered on the DataLayer to replace the default.
The second approach is probably a bit more complicated and needs a better understanding of NatTable internals. And it blocks the dynamic scaling feature to really zoom out on huge tables. So it depends on your use cases which approach to use. Typically the first option should be sufficient.
BTW, if the images are the issue, changing the scaling at runtime without re-registering the images could also cause a rendering issue. The reason is that images are stored in the ImageRegistry and need to be updated there when the scaling changes. For approach 1, that means registering all images in the ConfigRegistry again after the ConfigureScalingCommand, at least if you are not using themes or CSS styling.

Changing the compression format of multiple textures

I'm using the following method to compress a bunch of textures:
public void OnPostprocessTexture (Texture2D t) {
    EditorUtility.CompressTexture(t, TextureFormat.DXT5, 2);
}
The idea is to try to compress the texture when importing it. I have a project with many textures which are not using an optimal format.
The problem is that these changes are not saved anywhere; if you check the editor, you'll see that the format remains the same. I could leave the script there and reimport everything on the build server, but I'd like a way to save these changes.
The only way I can think of is to create another texture using the format I want and copy/replace the original. Is there a better way to do this?
Edit
Doing some more tests, I noticed something strange: EditorUtility.CompressTexture is somehow compressing NPOT textures. This is before running the script:
And this is after running EditorUtility.CompressTexture:
How does this work?
While Sergey's answer helps me change the format, it won't compress NPOT textures and I really need to save these bytes.
You have 2 problems here.
Problem 1
You are trying to do it inside OnPostprocessTexture method, which is too late. You need to do it inside OnPreprocessTexture method instead.
Problem 2
EditorUtility.CompressTexture compresses a texture object itself (an object that resides in RAM), not a corresponding asset (an object that resides on disk).
The correct way of doing what you want is to use TextureImporter.textureFormat.
Solution
Here is a working example:
public void OnPreprocessTexture()
{
    TextureImporter textureImporter = assetImporter as TextureImporter;
    textureImporter.textureFormat = TextureImporterFormat.DXT5;
}
Answer to comment
Another detail: I don't agree that CompressTexture creates an object in RAM instead of the disk, I think it creates a file inside the Library folder (that at some point is loaded into the RAM).
No, it doesn't create anything inside the Library folder. It's quite easy to check:
1) Start with a minimal Unity3D project: only one texture and only one editor script (a descendant of AssetPostprocessor with your OnPostprocessTexture method).
2) Open the Library folder and note the number of files and their total size.
3) Reimport the texture from the Unity editor (this will execute your OnPostprocessTexture method).
4) Open the Library folder again and see that no files were added and the total size remains the same (at least this is how it works on my machine).
Answer to edited question
Doing some more tests, I noticed something strange: EditorUtility.CompressTexture is somehow compressing NPOT textures.
If I try to reproduce it, Unity outputs an error line to console: "!hasError".
Despite the Unity editor saying that the texture is compressed with DXT5 after such manipulations, the actual texture format is D3DFMT_A8R8G8B8 when you check it in a plugin.
One thing to note here: DXT compression doesn't actually require textures to have power-of-two sizes. It only requires the dimensions to be multiples of 4. Hope this helps :)
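To make the multiple-of-4 requirement concrete, here is a small illustrative sketch (not from the original answer) of the DXT block arithmetic: the image is cut into 4x4 pixel blocks, with DXT1 spending 8 bytes per block and DXT3/DXT5 spending 16.

```python
def dxt_size(width, height, fmt="DXT5"):
    """Compressed byte size of a DXT texture; dimensions must be multiples of 4."""
    if width % 4 or height % 4:
        raise ValueError("DXT needs dimensions that are multiples of 4")
    bytes_per_block = 8 if fmt == "DXT1" else 16  # DXT3/DXT5 use 16 bytes per 4x4 block
    return (width // 4) * (height // 4) * bytes_per_block

# A 20x12 NPOT texture compresses fine: both dimensions are multiples of 4.
print(dxt_size(20, 12))            # 240 bytes (5 x 3 blocks, 16 bytes each)
print(dxt_size(256, 256, "DXT1"))  # 32768 bytes
```

This is consistent with the observation above: NPOT textures can be DXT-compressed as long as each dimension divides evenly into 4x4 blocks.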

Customising Zurb Foundation 5 Grid, Gutter and Max width

I've only been able to find one or two references to the maximum number of columns we can use in a customized version of Foundation 5, and these claim the number is 16. However, I can find no official documentation of this.
In the Foundation 5 download customizer there are input fields for '# of Columns', 'Gutter (em)', and 'Max-Width (em)'.
I have tested with a value of up to 24 columns and the CSS output has classes included for up to 24 columns, so I'd assume the 16 column restriction no longer applies (or only applies to older versions of Foundation).
So my questions are:
Does anyone know what the max number of columns allowed is / or if others are using a particular number successfully?
If I custom-download Foundation 5 with more columns, how do I know what gutter and max-width values will work out correctly, or can I just use anything? (The downloader doesn't give any indication of incorrect or out-of-range values.)
Having a custom download, how would I best manage Foundation updates if I wanted to upgrade the CSS to future newer versions?
PS: I'm not using Sass or other pre-processor tools.
I am afraid that the only way I can see to do this easily is by using Sass.
If you look at the grid docs page, you can adjust _settings.scss thus:
$row-width: rem-calc(1000);
$column-gutter: rem-calc(30);
$total-columns: 12;
Foundation Grid docs
When you say "upgrade the CSS to future newer versions", do you mean newer versions of Foundation? If you did a customised download, you would probably need to do it again. If you are also making other styling changes outside of the Foundation stuff and you are not going to use Sass, I think your best option is to keep them in a separate stylesheet.
Without sounding like a Sass evangelist...it is really worth your while to use it. Makes life so much easier!
There is a CSS customizer that lets you choose up to 100 columns: http://foundation.zurb.com/develop/download.html
Also you can change it in Sass to any number.
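For anyone reasoning about what a custom column count produces, the grid widths follow directly from the column count: each `.small-N`/`.large-N` class gets N/total as a percentage (the gutter is applied as padding and doesn't change this). A quick illustrative sketch, not taken from Foundation's source:

```python
def column_width_pct(span, total_columns):
    """Percentage width a grid would assign to a cell spanning `span` of
    `total_columns` columns, rounded the way CSS output typically is."""
    if not 1 <= span <= total_columns:
        raise ValueError("span must be between 1 and total_columns")
    return round(100.0 * span / total_columns, 4)

print(column_width_pct(6, 24))   # 25.0  -> a 6-of-24 cell is a quarter row
print(column_width_pct(16, 16))  # 100.0 -> full row, whatever the total
```

Any total column count works arithmetically; the practical limit is just how many usable class names and how much CSS you are willing to ship.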

Matlab tiff setTag number not recognized

I am trying to change the value of a tag from a TIFF object in my matlab code. I keep getting this error:
Error using tifflib
Tag number (273) is unrecognized by the TIFF library.
Error in Tiff/setTag (line 1146)
tifflib('setField',obj.FileID, ...
The code I am using is included below:
fname='C:\FileLocation\pcd144_012.tif';
t=Tiff(fname,'r+');
t.getTag('StripOffsets')
t.setTag('StripOffsets',[8, 16392])
Why is it I can get the tag and see it, but cannot set the tag to a different value?
Here is a link to the tiff I am working with:
Tiff Data
I think that you're out of luck with this approach. The setTag methods are mostly used when building a TIFF from scratch, and my guess is that the 'StripOffsets' field is not modifiable. Keep in mind that these tools are designed for the normal case of non-broken image files; changing this field would usually either break the file or require re-encoding the data. The function should give better feedback (the documentation for the Tiff class could be better in general), so you might still contact The MathWorks to let them know about this.
As far as finding a way to edit these tags/fields, you might look for and try out some TIFF tag viewer/editor programs to see if they might do it. Otherwise it may come down to parsing the header yourself to find the relevant bytes.
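As a sketch of the "parse the header yourself" route: a TIFF file starts with a byte-order mark and an offset to the first image file directory (IFD), which is a count followed by 12-byte entries of tag/type/count/value. The snippet below (illustrative, Python rather than Matlab) builds a minimal little-endian TIFF in memory and locates the file offset of the StripOffsets (273) value; the same walk works on bytes read from a real file, after which you could patch the value in place.

```python
import struct

def find_tag(data, wanted_tag):
    """Walk the first IFD of a little-endian TIFF; return (value_offset, value)
    for a tag whose value fits inline in the entry, or None if absent."""
    assert data[:4] == b"II\x2a\x00", "not a little-endian TIFF"
    ifd = struct.unpack("<I", data[4:8])[0]
    (count,) = struct.unpack("<H", data[ifd:ifd + 2])
    for i in range(count):
        entry = ifd + 2 + 12 * i
        tag, ftype, n, value = struct.unpack("<HHII", data[entry:entry + 12])
        if tag == wanted_tag:
            return entry + 8, value  # the value field sits 8 bytes into the entry
    return None

# Minimal in-memory TIFF: 8-byte header, then one IFD with two entries.
entries = struct.pack("<HHII", 256, 3, 1, 64)     # ImageWidth (SHORT) = 64
entries += struct.pack("<HHII", 273, 4, 1, 1024)  # StripOffsets (LONG) = 1024
tiff = (b"II\x2a\x00" + struct.pack("<I", 8)      # IFD starts at offset 8
        + struct.pack("<H", 2) + entries
        + struct.pack("<I", 0))                   # no next IFD

print(find_tag(tiff, 273))  # (30, 1024): value lives at byte 30
```

Note that a real multi-strip file stores the offsets as an array elsewhere in the file, with the entry's value field holding a pointer to it, so the walk gains one indirection there.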

GWT CSSResources limit of css file size 65535 bytes limit

Does GWT's CssResource limit the CSS file size to no more than 65535 bytes? Why?
This is more of a Java limitation than a GWT one: the Java class file format caps a single method body and a single string constant at 65535 bytes, and the generated CssResource code embeds your CSS as string constants. From the Java perspective, the workaround is splitting up the methods.
However, for GWT you just need to split up your ClientBundle contents into smaller chunks.
Ideally GWT should have handled this for you by chunking up the file you are using in the ClientBundle, but since this is a corner case, I guess you might as well log a bug (there is a list of similar bugs in GWT's tracker).
Also, your immediate solutions would be:
1) If the text contents can be split across multiple files, do it!
2) If the text contents are changing and hence cannot be split, avoid using ClientBundle.
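The first workaround (splitting the contents across multiple files) amounts to breaking the CSS at rule boundaries so every chunk stays under the 65535-byte ceiling. A minimal sketch of such a splitter, with an illustrative limit far below the real one; the helper name and chunking strategy are my own, not anything GWT ships:

```python
def split_css(css, limit=65535):
    """Split CSS into chunks of at most `limit` bytes, breaking only after
    a closing brace so no rule is cut in half."""
    rules = [r + "}" for r in css.split("}") if r.strip()]
    chunks, current = [], ""
    for rule in rules:
        if current and len((current + rule).encode("utf-8")) > limit:
            chunks.append(current)
            current = ""
        current += rule
    if current:
        chunks.append(current)
    return chunks

css = ".a{color:red}" * 10
print([len(c) for c in split_css(css, limit=40)])  # [39, 39, 39, 13]
```

Each resulting chunk can then back its own CssResource method in the ClientBundle, keeping every generated string constant under the class-file limit.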