Previewing OpenGL ES Render Buffers on Android (Eclipse)

I've started to play a bit with Frame Buffer Objects and Render Buffers in OpenGL ES. One thing that bugs me is that I'm not able to see what data is currently in my Render Buffer instance, or simply put, what I've drawn into it. I know that I could draw my data into a texture and then sample it onto a rectangle, but I don't want to do that. Is anybody aware of some sort of plugin, preferably an Eclipse plugin, or perhaps an application, that would present me with the graphical data of the Render Buffer of my choice?

I'll answer my own question: there are some tools dedicated to NVIDIA Tegra chipsets that are really helpful when dealing with problems within the OpenGL scope (PerfHUD ES, for example).
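If a dedicated tool isn't available, another common fallback (besides rendering into a texture) is to read the currently bound framebuffer back to the CPU with glReadPixels and dump the pixels for inspection on the desktop. A minimal sketch in C, assuming an ES context is current and the framebuffer with your renderbuffer attached is bound; the function name and buffer handling are illustrative only:

    #include <GLES2/gl2.h>   /* or <GLES/gl.h> for ES 1.x on Android */
    #include <stdlib.h>

    /* Reads the color attachment of the currently bound framebuffer into
     * a caller-owned RGBA buffer. GL_RGBA / GL_UNSIGNED_BYTE is the
     * combination every ES implementation is required to support. */
    static unsigned char *read_back_framebuffer(int width, int height)
    {
        unsigned char *pixels = malloc((size_t)width * height * 4);
        if (!pixels)
            return NULL;

        glPixelStorei(GL_PACK_ALIGNMENT, 1);
        glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

        /* From here you could write the buffer out as a raw/PPM file and
         * open it in any image viewer to see what was actually rendered. */
        return pixels;
    }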

Related

Need some guidelines on iPad animation programming

I'm creating an interactive e-book for the iPad. This book will contain multiple pages consisting of a lot of animations (frame and motion animations), transitions, and so on. I was wondering what my development options are: should I use OpenGL, Quartz, ...?
I've used UIImageView.animationImages before and found that it had really bad performance. What's the best way to draw frame-based animations?
Does anybody have some good pointers to resources on this?
thanks in advance,
Thomas
I guess that depends a bit on what you'll be drawing. If you have a need for 3D, then OpenGL is the way to go, but it doesn't sound like you do. I have a feeling Quartz2D is going to be just fine for your 2D drawing needs. I've done drawing with both and they have a very similar API. I think the downside of using all the raw power of OpenGL is that you have then signed up for doing most of the work yourself. I don't recommend attempting to use Core Animation's high-level APIs to manipulate OpenGL views.
If you do use Quartz2D and "normal" UIViews instead of OpenGL/EAGLView, then you can take advantage of the many pre-canned animations Apple has already built with Core Animation. These include the card flip left/right, resizing, moving (x/y translation), rotation, and the ever-popular e-book page curl.
The best example of iBooks-like custom page-curl functionality I could find is this example code from High Caffeine Content. However, you don't have to bring that much math to the table if you just want to use the out-of-the-box Core Animation stuff. The bad performance you encountered could have been due to anything, including older/slower hardware. They have revved the graphics chips on the newer devices.

OpenGL - to use or not to use ? why - iPhone application dev

I have to develop an application that behaves like a Tetris game.
I have never used OpenGL for iPhone application development.
The application is something like this:
Red / green / blue square boxes drop from the top
Red + Red + Red = points & the boxes disappear
In the same way, the user has to make combinations & get points
There are different levels.
There are three buttons: Left and Right for movement & a bottom one for a speedy fall
For this kind of application, should I use OpenGL or not?
i.e. Is it possible to develop the entire application with views & their animations?
If yes, will it be more complex compared to OpenGL?
What is the advantage of using OpenGL?
(I know that it gives a good 2D/3D look)
(But here my question means - easier coding?)
(Or is OpenGL more complicated compared to Objective-C?)
(I am just asking because I am not aware of it)
Basically your options are:
Using OpenGL
Using Quartz
Using UIKit
OpenGL is a fairly complicated beast, but is by far the best way to squeeze performance out of the iPhone. Do you need it for a Tetris game, though? Almost certainly not.
Quartz is the toolkit used in Mac OS X and the iPhone to draw images and do image effects. Because I come from an OpenGL background in other languages, I find Quartz strange and frustrating. However, it is probably easier for someone who is new to both.
You can do everything here using UIKit, and it will definitely be much, much easier than the other options. The main disadvantage is that it's rather slow in comparison, but once again, for a Tetris-like game that shouldn't matter at all.
Before you go with UIKit, though, I recommend just checking out something like Cocos 2D, which will give you the advantages of OpenGL without the headache of dealing with all of its inner workings.
From the tone of your question it looks like you're confusing what OpenGL is and isn't with regard to Objective-C.
OpenGL is a library written in the C programming language (to put it simplistically) that excels at rendering shapes (especially 3D shapes) for display on a screen. It doesn't replace Objective-C inside your program, it merely assists you in drawing the shapes. If you don't use OpenGL, you'll need to write some sort of drawing/rendering code in your UIView (or subclass) to render the blocks. By using OpenGL, you will be provided with a lot of helpful C functions for drawing shapes, which otherwise you'd have to implement yourself. On top of that, OpenGL has thousands of man-hours' worth of drawing optimizations that you can take advantage of if you use it rather than trying to implement shape rendering yourself.
Having said that, OpenGL isn't all sunshine and roses. It works like a state machine and has its own assumptions about the way it will be used (like any API). Just because you know C and Objective-C doesn't mean that using OpenGL will be trivial. If you've never written any OpenGL code, I suggest you look into a reference like the venerable Red Book.
The thing to keep in mind is that OpenGL is not a language unto itself (ignoring the OpenGL Shading Language). It's merely a set of C functions to aid you in rendering graphics.
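To make that concrete, here is roughly what drawing a single colored block with raw OpenGL ES 1.1 calls looks like in C. This is only a sketch (the coordinates, colors, and function name are illustrative), but it shows that you are simply calling C functions against the GL state machine:

    #include <OpenGLES/ES1/gl.h>   /* iPhone SDK header for OpenGL ES 1.1 */

    /* Draw one red square (e.g. a Tetris block) as a triangle strip. */
    static void draw_block(float x, float y, float size)
    {
        const GLfloat verts[] = {
            x,        y,
            x + size, y,
            x,        y + size,
            x + size, y + size,
        };

        glDisable(GL_TEXTURE_2D);                /* flat color, no texture    */
        glColor4f(1.0f, 0.0f, 0.0f, 1.0f);       /* state machine: set color  */
        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(2, GL_FLOAT, 0, verts);  /* point GL at our vertices  */
        glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);   /* issue the draw            */
        glDisableClientState(GL_VERTEX_ARRAY);
    }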
You may also want to ask on http://iphonegamedev.stackexchange.com/, the new Stack Overflow variant just for iPhone gaming.
To learn & understand what you need, please go through the following link.
It includes links to all kinds of resources that you'll need.
http://maniacdev.com/2009/04/8-great-resources-for-learning-iphone-opengl-es/
Edit:
After reading your question properly (actually my own question - by R&D I found a solution):
I think you need to develop a 2D application.
Go for the following link. It's the best option for 2D animation.
http://code.google.com/p/cocos2d-iphone/
Don't forget to visit the following link if you need sample code.
http://monoclestudios.com/cocos2d_whitepaper.html

How should I organize OpenGL ES 1.x 2D layer tree?

I'm developing a cute puzzle app - http://gotoandplay.freeblog.hu/categories/compactTangram/ - and for performance reasons I decided to render the view with OpenGL. I've started learning it, and I'm OK with buffers, vertices, and textures in a really basic way.
The situation:
In the game the user manipulates 7 puzzlePiece objects, each of which has 5 sublayers to get a pretty lighting feel. Most of the textures are 256x256. The user manipulates only one piece at a time, so the rest is unchanged during play. A skeleton of the app without any graphics is here: http://gotoandplay.freeblog.hu/archives/2009/11/11/compactTangram_v10_-_puzzle_completement_test/
The question:
How should I organize them? Is it a good idea to "predraw" the actual piece states in separate framebuffers(?)/textures(?), or can I simply redraw every piece/layer (1+7*5=36 sprites) in a timestep? If I use "predraw", then what should I do? Draw to a puzzlePiece framebuffer? Then how can I draw it into the scene framebuffer? Or is there a simpler way to "merge" textures?
Hope you can understand my question; if it seems too dim, please take a look at my idea on how to render an actual piece on my blog (there is a simple Flash implementation of what I'm going to do) here: http://gotoandplay.freeblog.hu/archives/2010/01/07/compactTangram_072_-_tan_rendering_labs/
A common way of handling textures is to pack all your images into a 'texture atlas' at the start of the game/level.
Your maximum texture size is 1024x1024 and you can have about three of them in memory on the iPhone.
When you have all the images in these 'super textures' you can just draw the relevant area of the large texture. This has the advantage that you bind textures less often and gain better performance, as well as cutting out the excess space wasted by having to pad small images up to power-of-two texture sizes.
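To illustrate, the "relevant area" is just a set of normalized texture coordinates computed from the sub-image's pixel rectangle inside the atlas. A small C helper, assuming a 1024x1024 atlas; the struct and names are made up for the example:

    /* Hypothetical helper: convert a sub-image's pixel rectangle inside a
     * 1024x1024 atlas into the normalized (0..1) UVs OpenGL ES expects. */
    typedef struct { float u0, v0, u1, v1; } AtlasRegion;

    static AtlasRegion atlas_region(int x, int y, int w, int h)
    {
        const float atlasSize = 1024.0f;
        AtlasRegion r;
        r.u0 = x / atlasSize;
        r.v0 = y / atlasSize;
        r.u1 = (x + w) / atlasSize;
        r.v1 = (y + h) / atlasSize;
        return r;
    }
    /* Feed r.u0..r.v1 into the texture-coordinate array of the quad that
     * should show this sub-image; the atlas texture stays bound throughout. */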

Need help optimizing my 2d drawing on iPhone

I'm writing a game that displays 56 hexagon pieces filling the screen in the shape of a board. I'm currently drawing each piece using a singleton rendering class that, when called to draw a piece, creates a path from 6 points based on the coordinate passed in. This path is filled with a solid color and then a 59x59 PNG with an alpha-to-white gradient is overlaid on the drawing to give the piece a shiny look. Note I'm currently doing this in Core Graphics.
My first thought is that creating a path every time I draw is costly, and it seems like I could somehow do this once and then reuse it, but I'm not sure of the best approach. When I look at the bottlenecks with Shark, it looks like the drawing of the PNG is the most taxing part of the process. I've tried rendering just the PNG overlay or just the path without the overlay, and both give me some frame gains, although removing the PNG overlay yields the most frames.
My current thought is that at startup I should render 6 paths (1 for each color of piece I have), overlay them with the PNG, store an image of each of these pieces, and then just redraw those images each time I need them. Is there an efficient mechanism for storing something you've drawn once and redrawing it? It kind of sounds like I'd be running into the whole drawing-PNGs-too-often thing again, but maybe there's a less taxing method that does a similar thing...
Any suggestions are much appreciated.
Thanks!
You might try CGLayer or CALayer.
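For example, with CGLayer you render the piece once into an offscreen layer and then stamp that layer each frame instead of rebuilding the path and compositing the PNG every time. A rough sketch in C against the Core Graphics API; the function names, the global, and the placeholder fill are just for illustration:

    #include <CoreGraphics/CoreGraphics.h>

    static CGLayerRef gPieceLayer = NULL;

    /* Build one piece into an offscreen layer a single time. */
    static void build_piece_layer(CGContextRef destCtx)
    {
        gPieceLayer = CGLayerCreateWithContext(destCtx, CGSizeMake(59, 59), NULL);
        CGContextRef layerCtx = CGLayerGetContext(gPieceLayer);

        /* ...create the 6-point hexagon path, fill it with the piece color,
         * and draw the 59x59 gradient PNG on top, exactly once... */
        CGContextSetRGBFillColor(layerCtx, 0.2, 0.6, 0.9, 1.0); /* placeholder */
        CGContextFillRect(layerCtx, CGRectMake(0, 0, 59, 59));
    }

    /* Per frame: just stamp the cached layer at the piece's position. */
    static void draw_piece(CGContextRef destCtx, CGFloat x, CGFloat y)
    {
        CGContextDrawLayerAtPoint(destCtx, CGPointMake(x, y), gPieceLayer);
    }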
General thoughts:
Game programming on iPhone usually necessitates OpenGL. Core Graphics is a bit easier to work with, but OpenGL is optimized for speed.
Prerender this "shiny look" into the textures as much as is possible (as in: do it in Photoshop before you even insert them into your project). Alpha blending is hell on performance.
Maybe try PVRTC (also this tutorial), as it's a format from the manufacturer of the iPhone's GPU. Then again, this could make things worse depending on where your bottleneck is.
If you really need speed you have to go the OpenGL route. Be careful if you want to mix OpenGL and Core Animation, they can conflict.
OpenGL is a pain if you haven't done much with it. It sounds like you could use Core Animation and make each tile a layer. CA doesn't call the redraw again unless you change something, so you should be able to just move that layer around without taking a big hit. Also note that CA stores the layer in the texture memory so it should be much faster.
Some others have mentioned that you should use OpenGL. Here's a nice introduction specifically for the iPhone: OpenGL ES from the Ground Up: Table of Contents
You might also want to look at cocos2d. It seems to be significantly faster than using CoreAnimation in my tests, and provides lots of useful stuff for games.

Performance and background images for OpenGL ES/iPhone

I'm developing a 2D game for the iPhone using OpenGL ES and I'd like to use a 320x480 bitmapped image as a persistent background.
My first thought was to create a 320x480 quad and then map a texture onto it that represents the background. So... I created a 512x512 texture with a 320x480 image on it. Then I mapped that to the 320x480 quad.
I draw this background every frame and then draw animated sprites on top of it. This works fine except that the drawing of all of these objects (background + sprites) is too slow.
I did some testing and discovered that my slowdown is in the pixel pipeline. Not surprisingly, the large background image is the main culprit. To prove this, I removed the background draw and everything else rendered very fast.
I am looking for advice on how to keep my background and also improve performance.
Here's some more info:
1) I am currently testing on the Simulator (still waiting on Apple for the license)
2) The background is a PVR texture squeezed down to 128k
3) I had hoped that there might be a way to cache this background into a color buffer but haven't had any luck with that. That may be due to my inexperience with OpenGL ES, or it just might be a stupid idea that won't work :)
4) I realize that the entire background does not always have to refresh, just the parts that have been drawn over by the moving sprites. I started to look into techniques for refreshing (as necessary) parts of the background either as separate textures or with a scissor box, however this seems less than elegant.
Any tips/advice would be greatly appreciated...
Thank you.
Do not do performance testing on the simulator. Ever!
The differences to the real hardware are huge. In both directions.
If you draw the background every frame:
Do not clear the framebuffer. The background will overdraw the whole thing anyway.
Do you really need a background texture?
What about using a color gradient via vertex colors instead?
Try using the 2-bit (PVRTC 2bpp) mode for the texture.
Turn off all render stages that you do not need for the background.
E.g.: lighting, blending, depth test, ...
If you could post some of your drawing code it would be a lot easier to help you.
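Putting these tips together, the background pass might look something like this in ES 1.1 C code. This is only a sketch, assuming the background texture and a full-screen quad are already set up; the function and parameter names are illustrative:

    #include <OpenGLES/ES1/gl.h>

    /* Draw the full-screen background first; since it covers every pixel,
     * the color buffer is not cleared at all. */
    static void draw_background(GLuint backgroundTex,
                                const GLfloat *quadVerts,      /* 4 x (x,y) */
                                const GLfloat *quadTexCoords)  /* 4 x (u,v) */
    {
        glDisable(GL_LIGHTING);     /* no lighting needed for a flat image */
        glDisable(GL_BLEND);        /* background is opaque                */
        glDisable(GL_DEPTH_TEST);   /* nothing is behind it                */

        glEnable(GL_TEXTURE_2D);
        glBindTexture(GL_TEXTURE_2D, backgroundTex);

        glEnableClientState(GL_VERTEX_ARRAY);
        glEnableClientState(GL_TEXTURE_COORD_ARRAY);
        glVertexPointer(2, GL_FLOAT, 0, quadVerts);
        glTexCoordPointer(2, GL_FLOAT, 0, quadTexCoords);
        glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

        /* Re-enable whatever the sprites need (e.g. blending) afterwards. */
    }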
If you're making a 2D game, is there any reason you aren't using an existing library? Specifically, the cocos2d for iPhone may be worth your time. I can't answer your question about how to fix the issue doing it all yourself, but I can say that I've done exactly what you're talking about (having one full screen background with sprites on top) with cocos2d and it works great. (Assuming 60 fps is fast enough for you.) You may have your reasons for doing it yourself, but if you can, I would highly suggest at least doing a quick prototype with cocos2d and seeing if that doesn't help you along. (Details and source for the iPhone version are here: http://code.google.com/p/cocos2d-iphone/)
Thanks to everyone who provided info on this. All of the advice helped out in one way or another.
However, I wanted to make it clear that the main issue here turned out to be the behavior of the Simulator itself (as implied by Andreas in his response). Once I was able to get the application on the device, it performed much, much better. I mention this because, prior to developing my game, I had seen a lot of posts indicating that the device was much slower than the Simulator. This might be true in some instances (e.g. general application logic), but in my experience animations (particularly 3D transformations) are much faster on the device.
I don't have much experience with OpenGL ES, but this problem occurs generally.
Your idea about the 'color buffer' is good intuition: essentially you want to store your background in a framebuffer and load it directly onto your rendering buffer before drawing the foreground.
In OpenGL this is fairly straightforward with Frame Buffer Objects (FBOs). Unfortunately I don't think OpenGL ES supports them, but it might give you somewhere to start looking.
You may want to try using VBOs (Vertex Buffer Objects) and see if that speeds things up. A tutorial is here.
In addition, I just saw that since OpenGL ES v1.1 there is a function called glDrawTex (Draw Texture) that is designed for
fast rendering of background paintings, bitmapped font glyphs, and 2D framing elements in games
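For what it's worth, that draw-texture path comes from the GL_OES_draw_texture extension: you set a crop rectangle on the texture and then call glDrawTex*OES to blit it in window coordinates. A hedged sketch; the sizes and the function name are illustrative:

    #include <OpenGLES/ES1/gl.h>
    #include <OpenGLES/ES1/glext.h>   /* OES extension prototypes */

    /* Blit a 320x480 background texture to the screen using the
     * GL_OES_draw_texture extension (OpenGL ES 1.1). */
    static void draw_background_oes(GLuint backgroundTex)
    {
        /* Crop rectangle: x, y, width, height within the texture. */
        const GLint crop[4] = { 0, 0, 320, 480 };

        glBindTexture(GL_TEXTURE_2D, backgroundTex);
        glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_CROP_RECT_OES, crop);

        /* x, y, z, width, height in window coordinates. */
        glDrawTexiOES(0, 0, 0, 320, 480);
    }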
You could use frame buffer objects similar to the GLPaint example from Apple.
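On the iPhone that means the OES framebuffer-object extension (the same mechanism GLPaint uses): you could render the composed background into a texture once and then just draw that texture every frame. A rough sketch, assuming ES 1.1 with the OES suffixes; error handling is omitted and the names are illustrative:

    #include <OpenGLES/ES1/gl.h>
    #include <OpenGLES/ES1/glext.h>

    /* One-time setup: render the static background into a texture via an
     * OES framebuffer object, then reuse that texture every frame.
     * 'viewFramebuffer' is the EAGLView's on-screen framebuffer (name is
     * illustrative). Note: ES 1.1 textures should be power-of-two sized,
     * e.g. 512x512 to hold a 320x480 background. */
    static GLuint bake_background_texture(GLuint viewFramebuffer, GLsizei size)
    {
        GLuint tex, fbo;

        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, size, size, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, NULL);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        glGenFramebuffersOES(1, &fbo);
        glBindFramebufferOES(GL_FRAMEBUFFER_OES, fbo);
        glFramebufferTexture2DOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES,
                                  GL_TEXTURE_2D, tex, 0);

        /* ...draw the composed background here, exactly once... */

        glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
        glDeleteFramebuffersOES(1, &fbo);
        return tex;
    }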
Use a texture atlas to minimize the number of draw calls you make. You can use glTexCoordPointer to set texture coordinates that map each image to its correct position. Remember to set your vertex buffer too. Ideally one draw call will render your entire 2D scene.
Avoid enabling/disabling states where possible.
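A minimal shape of that batched draw in ES 1.1 C code might look like the following. It's a sketch only, assuming you have already filled the vertex and texture-coordinate arrays from the atlas; the function and parameter names are illustrative:

    #include <OpenGLES/ES1/gl.h>

    /* Draw an entire 2D scene of quadCount textured quads (two triangles,
     * i.e. 6 vertices, per quad) from a single atlas in one draw call. */
    static void draw_scene(GLuint atlasTex,
                           const GLfloat *verts,      /* 2 floats per vertex */
                           const GLfloat *texCoords,  /* 2 floats per vertex */
                           int quadCount)
    {
        glEnable(GL_TEXTURE_2D);
        glBindTexture(GL_TEXTURE_2D, atlasTex);   /* bound once per frame */

        glEnableClientState(GL_VERTEX_ARRAY);
        glEnableClientState(GL_TEXTURE_COORD_ARRAY);
        glVertexPointer(2, GL_FLOAT, 0, verts);
        glTexCoordPointer(2, GL_FLOAT, 0, texCoords);

        glDrawArrays(GL_TRIANGLES, 0, quadCount * 6);  /* one call, whole scene */
    }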