iPhone OpenGL render text

Does anyone know how I can render a string on the iPhone? It's for displaying my frames per second :p

There's no built-in way of rendering text in OpenGL, but there are two more or less common techniques: rendering the glyphs with geometry (less common) or using texture mapping (far more common). For your case, texture mapping would be very easy: set up a CGBitmapContext, render the text using Quartz, and then upload the image into the previously created texture using glTexSubImage2D.
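Below is a minimal sketch of that approach for an FPS counter (iOS, OpenGL ES 1.1). The fixed 256x64 size, the font, and the assumption that the texture was already created with glTexImage2D are placeholders to adapt to your own setup.

    #import <UIKit/UIKit.h>
    #import <OpenGLES/ES1/gl.h>

    // Draws "text" into a bitmap context with Quartz/UIKit and uploads the pixels
    // into an existing 256x64 RGBA texture. Call it whenever the FPS string changes.
    static void UpdateTextTexture(GLuint texture, NSString *text)
    {
        const size_t width = 256, height = 64;
        void *data = calloc(width * height, 4);   // zeroed = fully transparent

        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(data, width, height, 8, width * 4,
                                                     colorSpace, kCGImageAlphaPremultipliedLast);
        CGColorSpaceRelease(colorSpace);

        // Flip the coordinate system so UIKit string drawing comes out right side up.
        CGContextTranslateCTM(context, 0, height);
        CGContextScaleCTM(context, 1.0, -1.0);

        UIGraphicsPushContext(context);
        [[UIColor whiteColor] set];
        [text drawAtPoint:CGPointMake(0, 0) withFont:[UIFont systemFontOfSize:24]];
        UIGraphicsPopContext();

        // Replace the contents of the previously created texture.
        glBindTexture(GL_TEXTURE_2D, texture);
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                        GL_RGBA, GL_UNSIGNED_BYTE, data);

        CGContextRelease(context);
        free(data);
    }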
On the iPhone you could also just put a UILabel on top of your OpenGL view and let UIKit do the rendering. In my application this did not hurt performance at all (even though Apple claims it does).
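For something as simple as an FPS readout, the overlay really is just a couple of lines. In this sketch, glView stands in for whatever EAGL-backed view you render into, and fps for the value you compute each frame:

    UILabel *fpsLabel = [[UILabel alloc] initWithFrame:CGRectMake(10, 10, 100, 20)];
    fpsLabel.backgroundColor = [UIColor clearColor];
    fpsLabel.textColor = [UIColor whiteColor];
    [glView addSubview:fpsLabel];

    // Later, once per frame (or at some lower rate), on the main thread:
    fpsLabel.text = [NSString stringWithFormat:@"%.1f fps", fps];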

You can use Texture2D and its initWithString method to draw text in OpenGL. See the Crash Landing example that is included in the iPhone SDK.
You could also use a UILabel and have it on top of the OpenGL layer.
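If you have pulled Texture2D into your project, usage looks roughly like the sketch below; the exact initializer arguments vary a little between versions of the class, so treat this as an outline rather than a drop-in snippet:

    Texture2D *fpsTexture =
        [[Texture2D alloc] initWithString:@"60 fps"
                               dimensions:CGSizeMake(128, 32)
                                alignment:UITextAlignmentLeft
                                 fontName:@"Helvetica"
                                 fontSize:16];

    // Texture2D's drawAtPoint: expects texturing and the client arrays to be enabled.
    glEnable(GL_TEXTURE_2D);
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    [fpsTexture drawAtPoint:CGPointMake(10, 10)];
    [fpsTexture release];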

As said before, Texture2D is a good idea, but Crash Landing has been removed from much of Apple's sample code. What you can do is download Cocos2D and extract the Texture2D class provided there (it's the same class provided by Apple, but with a couple of extra features):
Cocos 2D for iPhone

Related

iPhone Best Way To Display Many Instances Of Small Image

I'm just wondering what would be the best way to display multiple instances of a small (10x1) image. I have an array of about 480 points, and I would like to draw the image at each of these points to draw a path. Would it be faster to use Core Graphics, or should I be using something like cocos2d?
It depends on whether you need it to animate. Core Graphics is probably fine if you are drawing it once and then displaying it as an image, but it will be really slow if you need to redraw it each frame.
UIKit is actually much quicker because UIView drawing is hardware accelerated, so you could just add a UIImageView for each point in the graph, but from my own experiments that will probably be too slow for realtime interaction if there are more than about 200 image views (at least if you want it to run on anything older than an iPhone 4S).
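For the record, the UIImageView version is about as simple as it sounds. In this sketch, points is assumed to be an NSArray of NSValue-wrapped CGPoints and dot.png the 10x1 image in your bundle:

    UIImage *dotImage = [UIImage imageNamed:@"dot.png"];
    for (NSValue *value in points) {
        UIImageView *dot = [[UIImageView alloc] initWithImage:dotImage];
        dot.center = [value CGPointValue];
        [self.view addSubview:dot];
        [dot release];   // the superview retains it
    }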
If you do need realtime performance, that really only leaves OpenGL, which is quite fiddly to set up unless you use a library like Cocos2D or Sparrow to simplify it. I'd suggest Sparrow for your purposes because Sparrow views can be used in a regular UIKit application, whereas Cocos2D provides a whole app framework and is harder to use for just a single view in an otherwise regular UIKit app.
http://www.sparrow-framework.org/
Without more context, another option is to use OpenGL directly and batch the composite image into a single buffer and draw call (OpenGL ES has no display lists), so all the instances are submitted at once.

Loading large background images in cocos2d

I'm working on an iPhone game developed using cocos2d and box2d which has to use a very large image as a background. My questions are:
- Which is the better way to load the image? (I'm talking about an image that can be 14K pixels long.) Is it better to cut it into smaller images and keep loading them as the player moves?
- Should I keep them in memory but invisible, or use the addChild method to load them as I need them and removeChild to remove the previous one?
Thanks in advance, any answer will be welcome :)
You can't use textures larger than 1024x1024 (or 2048x2048 on the iPhone 4 / iPod touch 4, and maybe the iPad too). So the only way to render such a big image is to render it in parts.
I would try to load the parts as the player moves (better in a separate thread).
It may also be possible for you to use a parallax background. If so, use it.
If your image is made from a lot of identical parts then it's a good idea to use CCTMXTiledMap.
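A rough sketch of the streaming idea, using cocos2d 1.x API names, is shown below. The slice file names, the 1024-pixel slice width, and the nextColumn ivar tracking which slice was requested are all placeholders for your own bookkeeping, and the methods are assumed to live in your game's CCLayer subclass:

    - (void)loadSliceForColumn:(int)column
    {
        nextColumn = column;
        NSString *file = [NSString stringWithFormat:@"bg_slice_%d.png", column];
        // Decodes the texture on a background thread, then calls back on the main thread.
        [[CCTextureCache sharedTextureCache] addImageAsync:file
                                                    target:self
                                                  selector:@selector(sliceLoaded:)];
    }

    - (void)sliceLoaded:(CCTexture2D *)texture
    {
        CCSprite *slice = [CCSprite spriteWithTexture:texture];
        slice.anchorPoint = ccp(0, 0);
        slice.position = ccp(nextColumn * 1024, 0);   // place the slice where it belongs
        [self addChild:slice z:-1];                   // behind the gameplay layer
        // When the player has moved on, call removeChild:cleanup: on the slice that
        // scrolled off-screen so its texture memory is freed.
    }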

How to create a distorted screen

I want to show a distorted image as an error page for my application. If possible, this could be a screenshot of the home screen with some graphic distortion. Is this possible?
Thank you.
As Daniel A. White's comment mentions, this will probably cause your application to be rejected from the App Store, but it can be accomplished in many ways. I think this technique would be acceptable if your own interface appeared broken, but not if you made any iOS-supplied looks appear broken.
You could just use your favorite image editor (e.g. Photoshop) to distort a screenshot and display it by putting it in a separate UIView. The image would be static; it couldn't react to the contents of your program's interface.
If your interface is drawn with OpenGL ES 2.0, you could draw your regular interface to a texture, then use that texture as input to another GLSL program that applied the distortion.
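The distortion pass itself can be very small. The fragment shader sketch below assumes your interface has already been rendered into a texture bound to u_sceneTexture and drawn on a full-screen quad; the varying and uniform names are placeholders for whatever your own pipeline uses:

    // Offsets each texture lookup with a sine wave so the captured interface looks "broken".
    static const char *kDistortFragmentShader =
        "precision mediump float;                                          \n"
        "varying vec2 v_texCoord;                                          \n"
        "uniform sampler2D u_sceneTexture;                                 \n"
        "void main()                                                       \n"
        "{                                                                 \n"
        "    vec2 offset = vec2(sin(v_texCoord.y * 40.0) * 0.02, 0.0);     \n"
        "    gl_FragColor = texture2D(u_sceneTexture, v_texCoord + offset);\n"
        "}                                                                 \n";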

antialiasing iPhone OpenGLES

I need antialiasing on the iPhone 3G (OpenGL ES 1.1), NOT the iPhone 3GS with OpenGL ES 2.0.
I've drawn a 3D model and see the following: the pixels on the edges of the model look jagged.
I've tried setting various filters on the texture, but those filters only make the INSIDE of the texture look better.
How can I get good antialiasing?
Maybe I should use some kind of smoothing when drawing the triangles? If so, how is that possible in OpenGL ES 1.1?
Thanks.
As of iOS 4.0, full-screen anti-aliasing is directly supported via an Apple extension to OpenGL. The basic concept is similar to epatel's suggestion: render the scene onto a larger framebuffer, then copy that down to a screen-sized framebuffer, then copy that buffer to the screen. The difference is, instead of creating a texture and rendering it onto a quad, the copy/sample operation is performed by a single function call (specifically, glResolveMultisampleFramebufferAPPLE()).
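The per-frame part is roughly the sketch below (OpenGL ES 1.1, iOS 4+). Here msaaFramebuffer is assumed to be an FBO whose color and depth renderbuffers were allocated with glRenderbufferStorageMultisampleAPPLE(), while screenFramebuffer, screenColorRenderbuffer, context, width and height are the usual objects from your EAGL view:

    // Draw the whole scene into the multisampled framebuffer as usual.
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, msaaFramebuffer);
    glViewport(0, 0, width, height);
    // ... render ...

    // Resolve (downsample) the multisampled buffer into the on-screen framebuffer.
    glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, msaaFramebuffer);
    glBindFramebufferOES(GL_DRAW_FRAMEBUFFER_APPLE, screenFramebuffer);
    glResolveMultisampleFramebufferAPPLE();

    // Present the resolved image.
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, screenColorRenderbuffer);
    [context presentRenderbuffer:GL_RENDERBUFFER_OES];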
For details on how to set up the buffers and modify your drawing code, you can read a tutorial on the Gando Games blog which is written for OpenGL ES 1.1; there is also a note on Apple's Developer Forums explaining the same thing.
Thanks to Bersaelor for pointing this out in another SO question.
You can render into a larger FBO and then use that as a texture on a square.
Have a look at this article for an explanation.
Check out the EGL_SAMPLE_BUFFERS and EGL_SAMPLES parameters to eglChooseConfig(), as well as glEnable(GL_MULTISAMPLE).
EDIT: Hrm, apparently you're out of luck, at least as far as standardized approaches go. As mentioned in that thread you can render to a large off-screen texture and scale to a smaller on-screen quad or jitter the view matrix several times.
We found another way to achieve this. If you edit your textures and add, for example, a 2-pixel border of transparent pixels, the colored pixels in the texture are blended with the transparent pixels where necessary, giving a basic anti-aliasing effect. You can read the full article here on our blog.
The advantage of this approach is that you are not rendering a bigger image, copying a buffer, or, even worse, making a texture from a buffer, so there is no impact on performance.

Generating graphics at runtime with Cocos2D - How to display?

I'm trying to create dynamic graphics for my game, which I'm building with Cocos2D. The graphics generation will occur at predictable, finite points, such as level loading. I'm having a hard time figuring out how to actually draw this at runtime. From what I can tell, the easiest way would be to draw into a PNG file at runtime and then load an AtlasSprite based on the PNG file, but I can't seem to figure out if this is indeed the best way or how to go about doing it. Any suggestions?
I'm not sure how Cocos2D loads Sprites or Atlases, so this is a more general answer.
It might be worth taking a look at the Texture2D class that comes with the old CrashLanding example app. It uses a bitmap graphics context to generate a texture of a string for drawing with OpenGL. The code uses the CGBitmapContextCreate function to create a context. You can draw whatever you want onto it.
Then once you've finished drawing, you can either save the file as a PNG or you can call glTexImage2D on the data to use it with OpenGL.
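If you'd rather go the save-as-PNG route the question mentions, the general shape is the sketch below; the 512x512 size, the drawing itself, and the file name level_background.png are placeholders:

    UIGraphicsBeginImageContext(CGSizeMake(512, 512));
    CGContextRef context = UIGraphicsGetCurrentContext();

    // ... draw whatever the level needs with Core Graphics ...
    CGContextSetRGBFillColor(context, 0.2, 0.6, 1.0, 1.0);
    CGContextFillEllipseInRect(context, CGRectMake(100, 100, 300, 300));

    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Write a PNG into the Documents directory so it can be loaded as a sprite later.
    NSString *documents = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                               NSUserDomainMask, YES) lastObject];
    NSString *path = [documents stringByAppendingPathComponent:@"level_background.png"];
    [UIImagePNGRepresentation(image) writeToFile:path atomically:YES];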
There's more information about it in the Graphics and Drawing documentation, specifically the section Creating and Drawing Images.
Edit: It looks like Cocos2D comes with Texture2D so you should be in good shape. Check out the initWithString method here.