Using CVOpenGLESTextureRef as a render target? - iPhone

I am trying to figure out how to use CVOpenGLESTextureRef and CVOpenGLESTextureCacheRef to replace using glReadPixels.
I understand how to use them to create a texture from incoming camera images as shown in the RosyWriter and CameraRipple demos. But I can't figure out how to use them to go the other way.
Comments in the header file for the function CVOpenGLESTextureCacheCreateTextureFromImage give the following example:
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache,
    pixelBuffer, NULL, GL_RENDERBUFFER, GL_RGBA8_OES, width, height,
    GL_RGBA, GL_UNSIGNED_BYTE, 0, &outTexture);
and this is all the information I can find. How do you use this?
Currently I am doing the following to create my offscreen Frame Buffer Object at the start of the app.
glGenFramebuffersOES(1, &frameBufferHandle);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, frameBufferHandle);
glGenRenderbuffersOES(1, &colorBufferHandle);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorBufferHandle);
glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_RGBA8_OES, width, height);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES,
    GL_RENDERBUFFER_OES, colorBufferHandle);
Then when I need to write it out to disk I bind it and call glReadPixels.
How would I use CVOpenGLESTextureRef and CVOpenGLESTextureCacheRef instead of the above?
How would I use it on the main render buffer, and not on an offscreen FBO?
Other information that may be pertinent:
I am using OpenGL ES 1.1 on 2.0 capable devices (moving to 2.0 is not an option).
I am already using CVOpenGLESTextureRef and CVOpenGLESTextureCacheRef to display the camera video image on screen.
I am writing out to video using CVPixelBufferRefs and BGRA format.
If I use the main renderbuffer I can call glReadPixels with GL_BGRA_EXT.
If I use an offscreen FBO (for a smaller video size) I have to use RGBA format and do bit swizzling.
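For reference, here is my best guess at a minimal, untested sketch of the texture-backed route, pieced together from the header comment and the RosyWriter code. The header example passes GL_RENDERBUFFER, but RosyWriter-style code uses GL_TEXTURE_2D; context, width, and height are assumed to exist, and the pixel buffer must be IOSurface-backed for the cache to accept it:
// Create the cache once, tied to the EAGLContext (add a __bridge cast under ARC).
CVOpenGLESTextureCacheRef textureCache;
CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, context, NULL, &textureCache);

// Create an IOSurface-backed BGRA pixel buffer to render into.
CFDictionaryRef empty = CFDictionaryCreate(kCFAllocatorDefault, NULL, NULL, 0,
    &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
CFMutableDictionaryRef attrs = CFDictionaryCreateMutable(kCFAllocatorDefault, 1,
    &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
CFDictionarySetValue(attrs, kCVPixelBufferIOSurfacePropertiesKey, empty);

CVPixelBufferRef pixelBuffer;
CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                    kCVPixelFormatType_32BGRA, attrs, &pixelBuffer);

// Wrap the pixel buffer in a texture.
CVOpenGLESTextureRef renderTexture;
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache,
    pixelBuffer, NULL, GL_TEXTURE_2D, GL_RGBA, width, height,
    GL_BGRA_EXT, GL_UNSIGNED_BYTE, 0, &renderTexture);

// Attach the texture to the FBO in place of the color renderbuffer; everything
// drawn into the FBO then lands in pixelBuffer, with no glReadPixels needed.
glBindFramebufferOES(GL_FRAMEBUFFER_OES, frameBufferHandle);
glBindTexture(GL_TEXTURE_2D, CVOpenGLESTextureGetName(renderTexture));
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glFramebufferTexture2DOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES,
    GL_TEXTURE_2D, CVOpenGLESTextureGetName(renderTexture), 0);
After rendering, I would expect CVPixelBufferLockBaseAddress and CVPixelBufferGetBaseAddress to give the rendered bytes on the CPU side, once the GPU has finished (a glFinish or fence before reading).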

Related

OpenGL ES 2.x: How to Discard Depth Buffer glDiscardFramebufferEXT?

I read in iOS OpenGL ES Logical Buffer Loads that a performance gain can be achieved by "discarding" your depth buffer after each draw cycle. I tried this, but my game engine no longer renders; I get glError 1286, i.e. GL_INVALID_FRAMEBUFFER_OPERATION_EXT, when I try to render the next cycle.
I get the feeling I need to initialize or set up the depth buffer each cycle if I'm going to discard it, but I can't seem to find any information on this. Here is how I initialize the depth buffer (all the buffers, actually):
// ---- GENERAL INIT ---- //
// Extract width and height.
int bufferWidth, bufferHeight;
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &bufferWidth);
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &bufferHeight);
// Create a depth buffer that has the same size as the color buffer.
glGenRenderbuffers(1, &m_depthRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, m_depthRenderbuffer);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24_OES, GAMESTATE->GetViewportSize().x, GAMESTATE->GetViewportSize().y);
// Create the framebuffer object.
GLuint framebuffer;
glGenFramebuffers(1, &framebuffer);
glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
    GL_RENDERBUFFER, m_colorRenderbuffer);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
    GL_RENDERBUFFER, m_depthRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, m_colorRenderbuffer);
And here is what I'm trying to do to discard the depth buffer at the end of each draw cycle:
// Discard the depth buffer
const GLenum discards[] = {GL_DEPTH_ATTACHMENT, GL_COLOR_ATTACHMENT0};
glBindFramebuffer(GL_FRAMEBUFFER, m_depthRenderbuffer);
glDiscardFramebufferEXT(GL_FRAMEBUFFER,1,discards);
I call that immediately following all of my draw calls and...
[m_context presentRenderbuffer:GL_RENDERBUFFER];
Any ideas? Any info someone could point me to? I tried reading through Apple's guide on the subject (where I got the original idea), http://developer.apple.com/library/ios/#documentation/3DDrawing/Conceptual/OpenGLES_ProgrammingGuide/WorkingwithEAGLContexts/WorkingwithEAGLContexts.html, but it doesn't seem to work quite right for me.
Your call to glDiscardFramebufferEXT(GL_FRAMEBUFFER,1,discards) is saying that you are discarding just 1 framebuffer attachment, however your discards array includes two: GL_DEPTH_ATTACHMENT and GL_COLOR_ATTACHMENT0.
Try changing it to:
glDiscardFramebufferEXT(GL_FRAMEBUFFER, 2, discards);
Also note that you say you are discarding these framebuffer attachments at the end of the draw cycle, directly before [m_context presentRenderbuffer:GL_RENDERBUFFER];. That discards the colour renderbuffer attachment you need in order to present the renderbuffer; perhaps try discarding just the depth attachment, which is no longer needed at this point.
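Putting both points together, the end of the frame might look like the sketch below. (Note, too, that the snippet in the question passes m_depthRenderbuffer, a renderbuffer name, to glBindFramebuffer, which expects the framebuffer object's name.)
// Discard only the depth attachment; the colour attachment is still
// needed by presentRenderbuffer:.
const GLenum discards[] = { GL_DEPTH_ATTACHMENT };
glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);   // the FBO, not a renderbuffer
glDiscardFramebufferEXT(GL_FRAMEBUFFER, 1, discards);

glBindRenderbuffer(GL_RENDERBUFFER, m_colorRenderbuffer);
[m_context presentRenderbuffer:GL_RENDERBUFFER];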
You only need to initialise your buffers once, not every draw cycle. glDiscardFramebufferEXT() doesn't actually delete your framebuffer attachment; it is simply a hint to the API that the contents of the renderbuffer are not needed in that draw cycle after the discard command completes. From Apple's OpenGL ES Programming Guide for iOS:
A discard operation is defined by the EXT_discard_framebuffer extension and is available on iOS 4.0 and later. Discard operations should be omitted when your application is running on earlier versions of iOS, but included whenever they are available. A discard is a performance hint to OpenGL ES; it tells OpenGL ES that the contents of one or more renderbuffers are not used by your application after the discard command completes. By hinting to OpenGL ES that your application does not need the contents of a renderbuffer, the data in the buffers can be discarded or expensive tasks to keep the contents of those buffers updated can be avoided.

How to save and redraw screen content in OpenGL ES

I'm working on a kind of iPhone game where the player travels through a programmatically generated wormhole. To draw the wormhole I chose to draw two arrays of textured vertical lines, each one pixel wide, to form the top and bottom walls of the wormhole. Every frame, all the lines must be shifted left to implement the player's movement, and new lines must be drawn in the free space at the right. But drawing 1000 textured rectangles every frame is killing my FPS.
I'm looking for a way to save all the lines that were drawn in the previous frame and redraw them all together at the new, shifted position.
It would be terrific if there were a way to draw textured rectangles into some kind of buffer that is bigger than the screen, and then render that buffer to the screen.
I guess these are newbie questions, because I'm totally new to OpenGL.
I spent hours trying to figure this out but haven't succeeded, so any help is appreciated.
To expand on Jerry's answer below, I'll walk you through the steps, since you're new. First, we'll create the frame buffer object:
GLuint framebuffer;
glGenFramebuffersOES(1, &framebuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, framebuffer);
Next, we'll create the empty texture to hold our snapshot. This is just the usual OpenGL texture creation stuff, and you can modify it to fit your needs, of course. The only line to note is the glTexImage2D call: instead of pixel data as the last argument, you can pass NULL, which creates an empty texture.
GLuint texture;
glGenTextures(1, &texture);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glBindTexture(GL_TEXTURE_2D, 0);
glDisable(GL_TEXTURE_2D);
Now we bind the texture to the frame buffer:
glFramebufferTexture2DOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_TEXTURE_2D, texture, 0);
and check to see if everything went OK:
if(glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) != GL_FRAMEBUFFER_COMPLETE_OES)
return false; // uh oh, something went wrong
Now we're all set up to draw!
glBindFramebufferOES(GL_FRAMEBUFFER_OES, framebuffer);
// do drawing here
glBindFramebufferOES(GL_FRAMEBUFFER_OES, 0);
And finally, clean up the frame buffer, if you don't need it any more:
glDeleteFramebuffersOES(1, &framebuffer);
Some caveats:
The size of the frame buffer must be a power of two (a helper for rounding up is sketched below).
You can go up to 1024x1024 on the latest iPhone, but there may be no need for that level of detail.
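Since view dimensions like 320x480 aren't powers of two, a small rounding helper (my own sketch, not part of the original steps) can pick the texture size:
// Round a dimension up to the next power of two; ES 1.1 textures must be
// power-of-two sized unless the OES_texture_npot extension is available.
static GLuint nextPowerOfTwo(GLuint x)
{
    GLuint p = 1;
    while (p < x)
        p <<= 1;
    return p;
}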
Offhand I don't know the exact size that'll be available on a particular model of iPhone, but the general idea would be to use a Frame Buffer Object (FBO) to render to a texture, then you can blit pieces from that texture to the screen buffer.
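To make that last blit step concrete, here is a sketch of drawing the saved texture back to the screen with the ES 1.1 fixed-function pipeline. The texture is the one from the walkthrough above, and the coordinates assume identity projection and modelview matrices, so the quad covers the whole viewport:
static const GLfloat vertices[]  = { -1, -1,   1, -1,   -1, 1,   1, 1 };
static const GLfloat texCoords[] = {  0,  0,   1,  0,    0, 1,   1, 1 };

glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, texture);
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glVertexPointer(2, GL_FLOAT, 0, vertices);
glTexCoordPointer(2, GL_FLOAT, 0, texCoords);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
glDisableClientState(GL_TEXTURE_COORD_ARRAY);
glDisableClientState(GL_VERTEX_ARRAY);
glDisable(GL_TEXTURE_2D);
// Flip the t coordinates if the image comes out upside down; FBO-rendered
// textures often do relative to screen space.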

How to switch between rendering and presenting framebuffers in iPhone OpenGL ES 2.0?

iPhone, OpenGL ES 2.0.
On the first frame, I render to my framebuffer and then present it (as it works by default in the template OpenGL ES application).
On the next frame, I want to use that rendered framebuffer as an input to my shaders, while rendering to another framebuffer and presenting that second framebuffer.
The next frame, I want to use framebuffer 2 as the input to my shaders, while rendering to the first framebuffer again.
Repeat.
How do I do this?
You should be able to set up a renderbuffer that has a texture backing it using code like the following:
// Offscreen position framebuffer object
glGenFramebuffers(1, &positionFramebuffer);
glBindFramebuffer(GL_FRAMEBUFFER, positionFramebuffer);
glGenRenderbuffers(1, &positionRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, positionRenderbuffer);
glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA8_OES, FBO_WIDTH, FBO_HEIGHT);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, positionRenderbuffer);
// Offscreen position framebuffer texture target
glGenTextures(1, &positionRenderTexture);
glBindTexture(GL_TEXTURE_2D, positionRenderTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, FBO_WIDTH, FBO_HEIGHT, 0, GL_RGBA, GL_UNSIGNED_BYTE, 0);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, positionRenderTexture, 0);
Switching buffers is as simple as using code like this:
glBindFramebuffer(GL_FRAMEBUFFER, positionFramebuffer);
glViewport(0, 0, FBO_WIDTH, FBO_HEIGHT);
You can then render to that buffer and display the resulting texture by passing it into a simple shader that displays it within a rectangular piece of geometry. That texture can also be fed into a shader which renders into another similar renderbuffer that is backed by a texture, and so on.
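As a sketch of that ping-pong pattern (the names fbo[], tex[], and inputTextureUniform are assumptions of mine, not from the sample code):
// Assumed: fbo[0]/fbo[1] were set up as above, with tex[0]/tex[1] as their
// colour textures; 'frame' counts rendered frames; the shader program with
// sampler uniform 'inputTextureUniform' is already in use.
int src = frame % 2;        // the texture rendered last frame
int dst = 1 - src;          // the render target for this frame

glBindFramebuffer(GL_FRAMEBUFFER, fbo[dst]);
glViewport(0, 0, FBO_WIDTH, FBO_HEIGHT);

glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, tex[src]);
glUniform1i(inputTextureUniform, 0);  // feed last frame's image to the shader
// ... issue draw calls into fbo[dst] ...

// Then bind the on-screen framebuffer, draw tex[dst] with a passthrough
// shader, and present as usual.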
If you need to do some CPU-based processing or readout, you can use glReadPixels() to pull in the pixels from this offscreen renderbuffer.
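A readback under the same assumptions could look like:
// Read back the offscreen framebuffer's contents (the FBO must be bound).
GLubyte *pixels = (GLubyte *)malloc((size_t)FBO_WIDTH * FBO_HEIGHT * 4);
glReadPixels(0, 0, FBO_WIDTH, FBO_HEIGHT, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
// ... process the pixels ...
free(pixels);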
For examples of this, you can try out my sample applications here and here. The former does processing of video frames from a camera, with one of the settings allowing for a passthrough of the video while doing processing in an offscreen renderbuffer. The latter example renders into a cube map texture at one point, then uses that texture to do environment mapping on a teapot.

OpenGL ES iPhone - drawing anti aliased lines

Normally, you'd use something like:
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_BLEND);
glEnable(GL_LINE_SMOOTH);
glLineWidth(2.0f);
glVertexPointer(2, GL_FLOAT, 0, points);
glEnableClientState(GL_VERTEX_ARRAY);
glDrawArrays(GL_LINE_STRIP, 0, num_points);
glDisableClientState(GL_VERTEX_ARRAY);
It looks good in the iPhone Simulator, but on the iPhone the lines come out extremely thin and without any anti-aliasing.
How do you get AA on the iPhone?
One can achieve the effect of anti-aliasing very cheaply by using vertices with opacity 0.
(The original answer illustrated the technique with example images, including a side-by-side comparison with AA.)
You can read a paper about this here:
http://research.microsoft.com/en-us/um/people/hoppe/overdraw.pdf
You could do something along these lines:
// Colors is a pointer to unsigned bytes (4 per color).
// Should alternate in opacity.
glColorPointer(4, GL_UNSIGNED_BYTE, 0, colors);
glEnableClientState(GL_COLOR_ARRAY);
// points is a pointer to floats (2 per vertex)
glVertexPointer(2, GL_FLOAT, 0, points);
glEnableClientState(GL_VERTEX_ARRAY);
glDrawArrays(GL_TRIANGLE_STRIP, 0, points_count);
glDisableClientState(GL_VERTEX_ARRAY);
glDisableClientState(GL_COLOR_ARRAY);
Starting with iOS 4.0 there is an easy solution: it's now possible to use anti-aliasing for the whole OpenGL ES scene with just a few lines of added code (and nearly no performance loss, at least on the SGX GPU).
For the code, please read the following Apple Dev-Forum thread.
There are also some sample pictures of how it looks for me on my blog.
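For reference, the setup boils down to something like this sketch (4x MSAA via the APPLE_framebuffer_multisample extension; the variable names are mine, not from the linked thread):
GLuint msaaFramebuffer, msaaRenderbuffer, msaaDepthbuffer;

glGenFramebuffers(1, &msaaFramebuffer);
glBindFramebuffer(GL_FRAMEBUFFER, msaaFramebuffer);

// Multisampled color buffer.
glGenRenderbuffers(1, &msaaRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, msaaRenderbuffer);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER, 4, GL_RGBA8_OES, width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, msaaRenderbuffer);

// Multisampled depth buffer.
glGenRenderbuffers(1, &msaaDepthbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, msaaDepthbuffer);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER, 4, GL_DEPTH_COMPONENT16, width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, msaaDepthbuffer);
Each frame you render into msaaFramebuffer and then resolve into the on-screen framebuffer and present, as shown in the snippet at the end of this question.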
Using http://answers.oreilly.com/topic/1669-how-to-render-anti-aliased-lines-with-textures-in-ios-4/ as a starting point, I was able to get good anti-aliased lines. They aren't perfect, nor are they as nice as the ones I had been drawing with Core Graphics, but they are pretty good. I am actually drawing the same lines (vertices) twice: once with a bigger texture and the line color, then with a smaller texture and translucent white.
There are artifacts when lines overlap too tightly and the alphas start to accumulate.
One approach around this limitation is tessellating your lines into textured triangle strips (as seen here).
The problem is that on the iPhone, OpenGL renders to a framebuffer object rather than the main framebuffer, and as I understand it FBOs don't support multisampling.
There are various tricks that can be done, such as rendering to another FBO at twice the display size and then relying on texture filtering to smooth things out. I haven't tried that myself, though, so I can't comment on how well it works.
I remember very specifically that I tried this and there is no simple way to do this using OpenGL on the iPhone. You can draw using CGPaths and a CGContextRef, but that will be significantly slower.
Put this at the end of your render method (with the MSAA framebuffer created in your framebuffer setup) and you will get an anti-aliased appearance:
/* added */
// [_context presentRenderbuffer:GL_RENDERBUFFER];  // (replaces the plain present call)

// Bind both the MSAA and view framebuffers.
glBindFramebuffer(GL_READ_FRAMEBUFFER_APPLE, msaaFramebuffer);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER_APPLE, framebuffer);

// Resolve: combine the multisampled buffer into the view buffer.
glResolveMultisampleFramebufferAPPLE();

// Present the final image to the screen.
glBindRenderbuffer(GL_RENDERBUFFER, _colorRenderBuffer);
[_context presentRenderbuffer:GL_RENDERBUFFER];
/* added */
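Between the resolve and the present, Apple's OpenGL ES Programming Guide additionally suggests discarding the multisampled attachments that are no longer needed, for example:
// Hint that the multisampled attachments can be dropped once resolved
// (EXT_discard_framebuffer, iOS 4.0+).
const GLenum discards[] = { GL_COLOR_ATTACHMENT0, GL_DEPTH_ATTACHMENT };
glDiscardFramebufferEXT(GL_READ_FRAMEBUFFER_APPLE, 2, discards);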

Texture2D iPhone SDK openGL

I'm using the Texture2D class in an iPhone game using OpenGL ES.
Are there any good tutorials for understanding the Texture2D class?
Specifically I'm looking at the initWithString method for printing text. As it is currently implemented, you get white text when you use it. I would like to modify the method so I could specify the RGB color of the text. Any help / pointers?
Because the class uses an alpha-only texture (read the code!), it will display in whatever color glColor has set. See this line in initWithData (which gets called by initWithString):
glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA,
width, height, 0, GL_ALPHA,
GL_UNSIGNED_BYTE, data);
For red text, just call glColor4ub(255, 0, 0, 255) prior to drawing the texture.
Make sure you enable GL_BLEND and GL_COLOR_MATERIAL prior to drawing.
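For example, drawing red text might look like this (a sketch assuming a Texture2D instance named textTexture created with initWithString:, drawn with the class's drawAtPoint: method):
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glColor4ub(255, 0, 0, 255);                 // red, fully opaque
[textTexture drawAtPoint:CGPointMake(160.0f, 240.0f)];
glColor4ub(255, 255, 255, 255);             // restore the default color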
The class is small. I recommend you just read it.