Loading a Unicode font in an OpenGL ES app - iPhone

I wish to load a Unicode font in my OpenGL ES app using the FreeType library. I initially considered using Arial Unicode MS, but it is too big, around 24 MB.
Is there a smaller Unicode font available? I also understand that another Unicode font might still not be small enough to solve my problem. Is there an alternative approach?

I think you're referring to texture-mapped fonts. As you already figured, a full Unicode font consumes rather a lot of space. However, changing the font won't make a difference, since the memory requirements don't depend on the typeface but on the resolution at which the glyphs are rendered into the texture atlas, multiplied by the number of glyphs included in the set.
While texture-mapped fonts are a viable option for alphabets with only a limited number of glyphs (Latin, Cyrillic, Greek, Korean, Arabic), they get rather clumsy if you want to support full internationalization.
There are two options worth considering:
Scan the text to be displayed for all glyphs required, and only render and upload those (see the sketch at the end of this answer); this however won't work so well for kanji and similar large scripts.
Render the whole text using FreeType and some layout library (Pango or similar) to a buffer.
I recommend rendering at 3 times the screen resolution with grayscale antialiasing, then modulating with a pixel-aligned filter mask texture, or using a fragment shader, to implement sub-pixel antialiasing.
There is a possible third method, though I have never implemented it myself: vector textures. In essence you implement an antialiasing spline rasterizer in the fragment shader and supply the spline parameters through samplers; this allows you to render crisp text with GPU acceleration.
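For the first option, the per-glyph work with FreeType boils down to rendering a grayscale bitmap and copying it into an atlas texture. A minimal sketch follows; `atlas_tex`, `pen_x`, and `pen_y` are hypothetical placement parameters, error handling is omitted, and it assumes the bitmap's rows are tightly packed:

```c
#include <OpenGLES/ES1/gl.h>  /* iPhone SDK GL ES 1.x header */
#include <ft2build.h>
#include FT_FREETYPE_H

/* Render one glyph with FreeType and copy it into a pre-allocated
 * GL_ALPHA atlas texture at (pen_x, pen_y). */
static void upload_glyph(FT_Face face, unsigned long charcode,
                         GLuint atlas_tex, int pen_x, int pen_y)
{
    /* Load the glyph and render it as an 8-bit grayscale bitmap. */
    if (FT_Load_Char(face, charcode, FT_LOAD_RENDER))
        return; /* glyph missing from this face */

    FT_Bitmap *bmp = &face->glyph->bitmap;

    glBindTexture(GL_TEXTURE_2D, atlas_tex);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1); /* rows are byte-aligned */
    glTexSubImage2D(GL_TEXTURE_2D, 0, pen_x, pen_y,
                    bmp->width, bmp->rows,
                    GL_ALPHA, GL_UNSIGNED_BYTE, bmp->buffer);
}
```

You would call this once per distinct glyph found in the text, recording each glyph's atlas position and metrics (advance, bearing) for layout.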

Related

Unity: text is clear in editor but blurry in game

I've read that Unity has problems rendering clear text, and I've tried out several different fixes: setting a large font size on the imported font and changing the character setting to Unicode, making the text size large and then scaling it down, and setting the filter mode to Point for pixel fonts. All these methods seem to work in the editor (the text appears crisp), but in the game the text is still blurry:
(screenshot: editor vs. in-game)
(The screenshot is using free aspect - the text looks better, but still somewhat blurry, when the game view resolution matches the reference resolution.)
Currently, the imported font (not a pixel font) is set to a size of 180, Rendering to Smooth, and Character to Unicode. The text isn't scaled right now and is at a size of 50. The canvas is set to scale with screen size, and the reference resolution is 2560x1440 (a Samsung S7). It should also be noted that at a different resolution, the box around the text (which is an image) also gets blurry, which makes me think it's a problem with the canvas scaling.
What am I missing?
Turns out all I had to do was uncheck Low Resolution Aspect Ratios in the game view... The text is still slightly blurry, but that's likely typical of Unity. To counter it, I just decided to use a pixel font (using the fix explained in this video).
I use scaling down as a workaround.
For me, it was using Unity-Remote. After building the app directly I didn't see the blur anymore.
I had the same problem. The best solution to blurry text in Unity is as follows:
Select the text item in the scene.
Go to Font and see where your font is located in your project folder.
Go to your font in your project.
Change the font size from 16 to 100-300.
Change "Character" to "Unicode".
Press Apply.

Ignore extra white space in Unity3D Texture

I have different textures for a player's helmet, shirt, and pants in order to render custom uniforms. They have white space so that they sit on the model correctly, but this is causing the app's file size to be huge once installed, because the game has over a hundred items and each texture is 2.7 MB.
How can I tell Unity to ignore parts of the image, or map the textures onto the player so that I don't need the white space? For example, cutting the whitespace out of the helmet image lowers its size to under 1 MB.
Thanks!
For the sake of others who read this:
The obvious answer is, cut out the empty spaces in an image editor. That will solve the problem in the way it really should be solved.
That being said, it's quite possible you are using poorly UV mapped models that need that space, and you are unable to fix this, as the person who asked this question is.
If you're in a position where it might cost a little time or money to get someone to fix it, you should, because no matter what, you're wasting space, and it will add up. No one wants a 100 MB download to get 50 MB worth of game. And if you paid someone for models and they came like this, consider taking it up with them, because this is a fairly major flaw.
The "real" answer:
The first thing you should do is enable compression. From your picture it appears you are using the RGBA 16-bit format. This is a lower-quality version of Truecolor (an uncompressed 32-bit format), but it is not compressed in the "traditional" sense.
You should use the "Compressed" image import setting (to see it you must turn off the Advanced settings). This will select one of several compression formats (depending on the platform), all of which are highly optimized. You can specify a particular compression format in the Advanced window, but it is rarely needed, as Unity is good at choosing the right one for a given situation and can take special cases (such as specific chipsets) into consideration.
Depending on the compression algorithm, that white space could easily end up taking next to no space, and depending on the image, the compression might end up virtually undetectable.
The "Compressed" setting typically reduces image size by a large factor; block-compression formats commonly store pixels at a quarter to an eighth of the uncompressed size.
From there, if your image is still too large you can experiment with the import size. You are importing at 1024x1024 right now; importing at 512x512 halves each dimension, so it takes roughly a quarter of the space at half the resolution, and depending on the art style, the visual change can often be negligible.
You can find more details on these changes in the documentation for the texture importer.

What are the differences between APNG and MNG?

I know that APNG is an extension of PNG, while MNG is more of its own format (albeit developed by the original PNG developers). MNG is barely supported by any browser at all, while APNG has native support almost only in Firefox (for various reasons related to backward compatibility and decoding, it seems).
Apart from all of these behind-the-scenes things, what are the differences between APNG and MNG? Does one have features the other doesn't (for example, storing only the parts that changed instead of always whole frames)? Does one have better performance or file size than the other?
APNG can create a new frame by replacing the entire image or by overlaying or blending a smaller image over part of it. To display a "pong" game you'd need a new image of the ball in each different location. APNG has essentially the same capabilities as animated GIF, but also allows 24-bit RGB and 8-bit alpha.
MNG can do that, plus it can also retrieve an image that was previously defined in the datastream and place it over the previous frame in a new location. To display your "pong" game you'd only need to transmit one image of the ball and use it like a sprite.
Much more detail is available in the specifications:
APNG: https://wiki.mozilla.org/APNG_Specification
MNG: http://www.libpng.org/pub/mng/spec/mng-lc.html

UIFont and Diacriticals

I'm writing an iPhone app that needs to render i18n text that includes diacriticals (tildes, accents, etc.). Apple provides the UIFont class, which can be used to get a given typeface/font-size combination's leading, ascent, descent, etc.
The problem is that this information does not accurately reflect diacriticals. Specifically, diacriticals on capital letters often exceed the lineHeight (the UIFont property formerly known as leading).
The same problem exists throughout the frameworks; e.g. NSString's sizeWithFont: has the same issue.
I need to know the true bounding box for text as I am using OpenGL which does not have text drawing support and therefore requires rendering text to a texture.
Currently, I'm using a hack to get around this issue. Is there a better way?
It's not possible with NSString, since it just returns a size. You can try CoreText which seems to support returning bounding boxes, but that's a bit overkill.
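A minimal sketch of that CoreText route, measuring the actual ink extents of a line (the calls below are the CoreText C API; the CGContext is required because image bounds are device-dependent, and memory management is kept to the essentials):

```c
#include <CoreText/CoreText.h>

/* Measure the tight bounding box around the rendered glyphs of a
 * string, including diacriticals that overhang the line height.
 * Unlike the typographic bounds (ascent/descent/leading), the image
 * bounds reflect what is actually inked. */
static CGRect ink_bounds(CFStringRef text, CTFontRef font, CGContextRef ctx)
{
    CFStringRef keys[]   = { kCTFontAttributeName };
    CFTypeRef   values[] = { font };
    CFDictionaryRef attrs = CFDictionaryCreate(
        kCFAllocatorDefault,
        (const void **)keys, (const void **)values, 1,
        &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);

    CFAttributedStringRef astr =
        CFAttributedStringCreate(kCFAllocatorDefault, text, attrs);
    CTLineRef line = CTLineCreateWithAttributedString(astr);

    /* Tight box around the inked glyphs, in context coordinates. */
    CGRect bounds = CTLineGetImageBounds(line, ctx);

    CFRelease(line);
    CFRelease(astr);
    CFRelease(attrs);
    return bounds;
}
```

The returned rectangle is what you would size your OpenGL texture to, plus a small safety margin.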
It's a difficult problem when Unicode supports things like è̀̀̀ (see also: zalgo); things can render above the top of a line so you can't just draw the characters. Some text-drawing APIs make you specify the baseline and give you the bounding box so you can get both ascenders and descenders, but UIKit doesn't do this.
Then, you have crazy cursive fonts with the occasional huge ascender. It's unclear how to handle these either.
The lazy way is to render to a texture with margins at the top and bottom (0.5 lines? 1 line?) and not care too much about the extra overhead of some transparent pixels.
I haven't looked at CoreText much, but it doesn't look particularly promising.

Why do images for textures on the iPhone need to have power-of-two dimensions?

I'm trying to solve a flickering problem on the iPhone (an OpenGL ES game). I have a few images that don't have power-of-2 dimensions. I'm going to replace them with images of appropriate dimensions... but why do the dimensions need to be powers of two?
The reason that most systems (even many modern graphics cards) demand power-of-2 textures is mipmapping.
What is mipmapping?
Smaller versions of the image are created so that the texture still looks correct at very small sizes. The image is divided by 2 over and over to make the new images.
So, imagine a 256x128 image. This would have smaller versions created of dimensions 128x64, 64x32, 32x16, 16x8, 8x4, 4x2, 2x1, and 1x1.
If this image was 256x192, it would work fine until you got down to a size of 4x3. The next smaller image would be 2x1.5 which is obviously not a valid size. Some graphics hardware can deal with this, but many types cannot.
Some hardware also requires a square image but this isn't very common anymore.
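The halving rule is easy to see in code. A tiny sketch (plain C, integer halving clamped at 1) printing the chain for the 256x192 example:

```c
#include <stdio.h>

/* Print the mip chain of a texture, halving each dimension until
 * both reach 1. For 256x192, integer division turns 4x3 into 2x1,
 * silently dropping the fractional 1.5 the hardware would need. */
int main(void)
{
    int w = 256, h = 192;
    for (;;) {
        printf("%dx%d\n", w, h);
        if (w == 1 && h == 1)
            break;
        if (w > 1) w /= 2;
        if (h > 1) h /= 2;
    }
    return 0;
}
```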
Why do you need mipmapping?
Imagine that you have a picture that is VERY far away, so far away as to be only the size of 4 pixels. Now, when each pixel is drawn, a position on the image will be selected as the color for that pixel. So you end up with 4 pixels that may not be at all representative of the image as a whole.
Now, imagine that the picture is moving. Every time a new frame is drawn, a new pixel is selected. Because the image is SO far away, you are very likely to see very different colors for small changes in movement. This leads to very ugly flashing.
Lack of mipmapping causes problems for any size that is smaller than the texture size, but it is most pronounced when the image is drawn down to a very small number of pixels.
With mipmaps, the hardware will have access to a 2x2 version of the texture, so each pixel of it will be the average color of one quadrant of the image. This eliminates the odd color flashing.
http://en.wikipedia.org/wiki/Mipmap
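Generating a mip level is just that averaging. A minimal sketch for an 8-bit grayscale image (assuming power-of-two dimensions, so the halved sizes stay exact):

```c
/* Produce the next mip level by averaging each 2x2 quadrant of an
 * 8-bit grayscale image. dst must hold (w/2) * (h/2) bytes. */
void downsample(const unsigned char *src, int w, int h,
                unsigned char *dst)
{
    for (int y = 0; y < h / 2; y++)
        for (int x = 0; x < w / 2; x++) {
            int sum = src[(2 * y)     * w + 2 * x]
                    + src[(2 * y)     * w + 2 * x + 1]
                    + src[(2 * y + 1) * w + 2 * x]
                    + src[(2 * y + 1) * w + 2 * x + 1];
            dst[y * (w / 2) + x] = (unsigned char)(sum / 4);
        }
}
```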
Edit to people who say this isn't true anymore:
It's true that many modern GPUs can support non-power-of-two textures but it's also true that many cannot.
In fact, just last week I had a 1024x768 texture in an XNA app I was working on, and it caused a crash on game load on a laptop that was only about a year old. It worked fine on most machines, though. It's a safe bet that the iPhone's GPU is considerably simpler than a full PC GPU.
Typically, graphics hardware works natively with textures in power-of-2 dimensions. I'm not sure of the implementation/construction details that cause this to be the case, but it's generally how it is everywhere.
EDIT: With a little research, it turns out my knowledge is a little out of date -- a lot of modern graphics cards can handle arbitrary texture sizes now. I would imagine that with the space limitations of a phone's graphics processor though, they'd probably need to omit anything that would require extra silicon like that.
You can find OpenGL ES support info for Apple iPod/iPhone devices here:
Apple OpenGL ES support
OpenGL ES 2.0 roughly corresponds to desktop OpenGL 2.0, and the constraint on texture sizes was only lifted with version 2.0.
So if you are using an OpenGL ES version below 2.0, this is the expected behavior.
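You can also probe for non-power-of-two support at runtime rather than relying on the version number alone. A small sketch (the extension strings shown are the commonly reported ones; verify against your target devices):

```c
#include <OpenGLES/ES1/gl.h>  /* iPhone SDK GL ES 1.x header */
#include <string.h>

/* Return non-zero if the driver advertises some form of
 * non-power-of-two texture support. */
static int has_npot_support(void)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext && (strstr(ext, "GL_OES_texture_npot") ||
                   strstr(ext, "GL_APPLE_texture_2D_limited_npot"));
}
```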
I imagine it's a pretty decent optimization in the graphics hardware to assume power-of-2 textures. I bought a new laptop with the latest laptop graphics hardware, and if textures aren't power-of-2 in Maya, the rendering is all messed up.
Are you using PVRTC compression? That requires powers of 2 and square images.
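For reference, PVRTC data goes through glCompressedTexImage2D. A minimal sketch (assumes the GL_IMG_texture_compression_pvrtc extension; `width`, `height`, `size`, and `data` are assumed to come from your pre-compressed texture file, with width equal to height and both powers of two):

```c
#include <OpenGLES/ES1/gl.h>
#include <OpenGLES/ES1/glext.h>  /* defines the PVRTC format enums */

/* Upload a pre-compressed PVRTC 4-bpp RGBA texture. */
static void upload_pvrtc(GLuint tex, int width, int height,
                         const void *data, GLsizei size)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glCompressedTexImage2D(GL_TEXTURE_2D, 0,
                           GL_COMPRESSED_RGBA_PVRTC_4BPPV1_IMG,
                           width, height, 0, size, data);
}
```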
Try implementing wrapped texture mapping in software and you will quickly discover why power-of-2 sizes are desirable.
In short, you will find that if you can assume power-of-2 dimensions then a lot of integer multiplications and divisions turn into bit-shifts.
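A sketch of what that means for a software sampler (the function and parameter names are illustrative):

```c
#include <stdint.h>

/* Software texel fetch with wrapping. With arbitrary texture sizes,
 * every lookup needs a modulo and a multiply; with power-of-two
 * sizes both collapse into an AND and a shift. */
static inline uint32_t texel_any(const uint32_t *texels,
                                 int w, int h, int u, int v)
{
    return texels[(v % h) * w + (u % w)];        /* divide + multiply */
}

static inline uint32_t texel_pow2(const uint32_t *texels,
                                  int log2_w, int log2_h, int u, int v)
{
    int x = u & ((1 << log2_w) - 1);             /* wrap becomes AND  */
    int y = v & ((1 << log2_h) - 1);             /* wrap becomes AND  */
    return texels[(y << log2_w) + x];            /* multiply -> shift */
}
```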
I would hazard a guess that the recent trend in relaxing this restriction is due to GPUs moving to floating-point maths.
Edit: The "because of mipmapping" answer is incorrect. Mipmapped, non-power-of-two textures are a common feature of modern GPUs.