Problem enabling OpenGL ES depth test on iPhone. What steps are necessary?

I remember running into this problem when I started using OpenGL on OS X. Eventually I solved it, but I think that was just by using GLUT and C++ instead of Objective-C...
The lines of code I have in init for the ES1Renderer are as follows:
glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LEQUAL);
Then in the render method, I have this:
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
I assume I'm missing something specific to either the iPhone or ES. What other steps are required to enable the depth test?
Thanks

The instructions are here, if anyone else has this problem. The code is also below:
glGenRenderbuffersOES(1, &depthRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthRenderbuffer);
glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, 320, 480);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, depthRenderbuffer);
GLenum status = glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES);
if (status != GL_FRAMEBUFFER_COMPLETE_OES) {
    NSLog(@"failed to make complete framebuffer object %x", status);
}

You need to allocate the depth buffer itself. Allocate a new renderbuffer with the internal format DEPTH_COMPONENT16 or DEPTH_COMPONENT24, and attach it to the framebuffer object.

If you're using the OpenGL ES project template, #define USE_DEPTH_BUFFER 1. This enables the depth buffer setup in EAGLView.m.
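For context, in the template's EAGLView.m that flag gates the depth-buffer setup inside createFramebuffer. A rough sketch of what it controls (following the template's naming; this mirrors the template rather than quoting it verbatim):

```objc
#define USE_DEPTH_BUFFER 1

// ...inside createFramebuffer, after the color renderbuffer has its storage:
if (USE_DEPTH_BUFFER) {
    glGenRenderbuffersOES(1, &depthRenderbuffer);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthRenderbuffer);
    glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES,
                             backingWidth, backingHeight);
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES,
                                 GL_RENDERBUFFER_OES, depthRenderbuffer);
}
```

With the depth attachment in place, the glEnable(GL_DEPTH_TEST) and glClear(... | GL_DEPTH_BUFFER_BIT) calls from the question actually have a buffer to work against.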

CVOpenGLESTextureCacheCreateTextureFromImage from uint8_t buffer

I'm developing a video player for iPhone. I use the ffmpeg libraries to decode video frames and OpenGL ES 2.0 to render them to the screen.
But my render method is very slow.
A user told me:
iOS 5 includes a new way to do this fast. The trick is to use AVFoundation and link a Core Video pixel buffer directly to an OpenGL texture.
My problem now is that my video player passes a uint8_t* buffer to the render method, which I then use with glTexSubImage2D.
But if I want to use CVOpenGLESTextureCacheCreateTextureFromImage I need a CVImageBufferRef with the frame.
The question is: how can I create a CVImageBufferRef from a uint8_t buffer?
This is my render method:
- (void)render:(uint8_t *)buffer
{
    NSLog(@"render");
    [EAGLContext setCurrentContext:context];
    glBindFramebuffer(GL_FRAMEBUFFER, defaultFramebuffer);
    glViewport(0, 0, backingWidth, backingHeight);
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    // OpenGL loads textures lazily, so accessing the buffer is deferred until draw;
    // notify the movie player that we're done with the texture after glDrawArrays.
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, mFrameW, mFrameH, GL_RGB, GL_UNSIGNED_SHORT_5_6_5, buffer);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    [moviePlayerDelegate bufferDone];
    glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);
    [context presentRenderbuffer:GL_RENDERBUFFER];
}
Thanks,
I am trying to do something similar.
Apparently, you need to create a CVPixelBufferRef and substitute it for the CVImageBufferRef. That is, you first create the CVPixelBufferRef, as described here, and then get access to the pixel buffer:
CVPixelBufferLockBaseAddress(renderTarget, 0);
_rawBytesForImage = (GLubyte *)CVPixelBufferGetBaseAddress(renderTarget);
(Code not mine).
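To make the wrapping step concrete, here is an untested sketch of creating a CVPixelBufferRef directly around an existing uint8_t* (the width, height, bytesPerRow, and buffer variables are illustrative, not from the question's code):

```objc
// Wrap an existing byte buffer in a CVPixelBufferRef without copying.
// Passing NULL for the release callback means we retain ownership of `buffer`.
CVPixelBufferRef pixelBuffer = NULL;
CVReturn err = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                            width, height,
                                            kCVPixelFormatType_32BGRA,
                                            buffer,
                                            bytesPerRow,   // e.g. width * 4 for BGRA
                                            NULL, NULL,    // release callback + refcon
                                            NULL,          // pixel buffer attributes
                                            &pixelBuffer);
if (err != kCVReturnSuccess) {
    NSLog(@"CVPixelBufferCreateWithBytes failed: %d", err);
}
```

One caveat: the fast-path texture cache generally wants an IOSurface-backed pixel buffer, and buffers made with CVPixelBufferCreateWithBytes are not IOSurface-backed. If CVOpenGLESTextureCacheCreateTextureFromImage rejects the buffer, you may instead need CVPixelBufferCreate with kCVPixelBufferIOSurfacePropertiesKey in the attributes dictionary, then lock the buffer and memcpy your bytes into it.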
For an actual working example that shows how video data can be passed directly to an OpenGL view, see my blog post on the subject. The problem with looking at a series of "code pieces" online is that you will not find complete working examples for iOS.

How can I draw (as in GLPaint) onto a background image, and with temporary drawings?

I am writing a GLPaint-esque drawing application for the iPad, however I have hit a stumbling block. Specifically, I am trying to implement two things at the moment:
1) A background image that can be drawn onto.
2) The ability to draw temporary shapes, e.g. you might draw a line, but the final shape would only be committed once the finger has lifted.
For the background image, I understand the idea is to draw the image into a VBO and draw it right before every line drawing. This is fine, but now I need to add the ability to draw temporary shapes... with kEAGLDrawablePropertyRetainedBacking set to YES (as in GLPaint), the temporary shapes are obviously not temporary! Turning the retained backing property to NO works great for the temporary objects, but now my previous freehand lines aren't kept.
What is the best approach here? Should I be looking to use more than one EAGLLayer? All the documentation and tutorials I've found seem to suggest that most things should be possible with a single layer. They also say that retained backing should pretty much always be set to NO. Is there a way to work my application in such a configuration? I tried storing every drawing point into a continually expanding vertex array to be redrawn each frame, but due to the sheer number of sprites being drawn this isn't working.
I would really appreciate any help on this one, as I've scoured online and found nothing!
I've since found the solution to this problem. The best way appears to be to use custom framebuffer objects and render-to-texture. I hadn't heard of this before asking the question, but it looks like an incredibly useful tool for the OpenGLer's toolkit!
For those that may be wanting to do something similar, the idea is that you create a FBO and attach a texture to it (instead of a renderbuffer). You can then bind this FBO and draw to it like any other, the only difference being that the drawings are rendered off-screen. Then all you need to do to display the texture is to bind the main FBO and draw the texture to it (using a quad).
So for my implementation, I used two different FBOs with a texture attached to each - one for the "retained" image (for freehand drawing), and the other for the "scratch" image (for temporary drawings). Each time a frame is rendered, I first draw a background texture (in my case I just used the Texture2D class), then draw the retained texture, and finally the scratch texture if required. When drawing a temporary shape everything is rendered to the scratch texture, and this is cleared at the start of every frame. Once it is finished, the scratch texture is drawn to the retained texture.
Here are a few snippets of code that might be of use to somebody:
1) Create the framebuffers (I have only shown a couple here to save space!):
// ---------- DEFAULT FRAMEBUFFER ---------- //
// Create framebuffer.
glGenFramebuffersOES(1, &viewFramebuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
// Create renderbuffer.
glGenRenderbuffersOES(1, &viewRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
// Get renderbuffer storage and attach to framebuffer.
[context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:layer];
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, viewRenderbuffer);
glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);
// Check for completeness.
status = glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES);
if (status != GL_FRAMEBUFFER_COMPLETE_OES) {
    NSLog(@"Failed to make complete framebuffer object %x", status);
    return NO;
}
// Unbind framebuffer.
glBindFramebufferOES(GL_FRAMEBUFFER_OES, 0);
// ---------- RETAINED FRAMEBUFFER ---------- //
// Create framebuffer.
glGenFramebuffersOES(1, &retainedFramebuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, retainedFramebuffer);
// Create the texture.
glColor4f(0.0f, 0.0f, 0.0f, 0.0f);
glGenTextures(1, &retainedTexture);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, retainedTexture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1024, 1024, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glBindTexture(GL_TEXTURE_2D, 0);
// Attach the texture to the framebuffer as the color attachment.
glFramebufferTexture2DOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_TEXTURE_2D, retainedTexture, 0);
// Check for completeness.
status = glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES);
if (status != GL_FRAMEBUFFER_COMPLETE_OES) {
    NSLog(@"Failed to make complete framebuffer object %x", status);
    return NO;
}
// Unbind framebuffer.
glBindFramebufferOES(GL_FRAMEBUFFER_OES, 0);
2) Draw to the render-to-texture FBO:
// Ensure that we are drawing to the current context.
[EAGLContext setCurrentContext:context];
glBindFramebufferOES(GL_FRAMEBUFFER_OES, retainedFramebuffer);
glViewport(0, 0, 1024, 1024);
// DRAWING CODE HERE
3) Render the various textures to the main FBO, and present:
glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
glViewport(0, 0, backingWidth, backingHeight);
glClearColor(1.0f, 1.0f, 1.0f, 1.0f); // Clear to white.
glClear(GL_COLOR_BUFFER_BIT);
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
[self drawBackgroundTexture];
[self drawRetainedTexture];
[self drawScratchTexture];
glDisableClientState(GL_VERTEX_ARRAY);
glDisableClientState(GL_TEXTURE_COORD_ARRAY);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
[context presentRenderbuffer:GL_RENDERBUFFER_OES];
For example, drawing the retained texture via [self drawRetainedTexture] would use the following code:
// Bind the texture.
glBindTexture(GL_TEXTURE_2D, retainedTexture);
// Destination coords.
GLfloat retainedVertices[] = {
    0.0,          backingHeight, 0.0,
    backingWidth, backingHeight, 0.0,
    0.0,          0.0,           0.0,
    backingWidth, 0.0,           0.0
};
// Source coords.
GLfloat retainedTexCoords[] = {
    0.0, 1.0,
    1.0, 1.0,
    0.0, 0.0,
    1.0, 0.0
};
// Draw the texture.
glVertexPointer(3, GL_FLOAT, 0, retainedVertices);
glTexCoordPointer(2, GL_FLOAT, 0, retainedTexCoords);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
// Unbind the texture.
glBindTexture(GL_TEXTURE_2D, 0);
A lot of code, but I hope that helps somebody. It certainly had me stumped for a while!

Using the depth buffer in OpenGL ES 2.0 on iPhone

I followed a tutorial on using the depth buffer in OpenGL ES 1.1, but I use OpenGL ES 2.0. The implemented code results in an error: failed to make complete framebuffer object 8cd6.
See the implemented code below:
- (void)createFramebuffer
{
    if (context && !defaultFramebuffer)
    {
        [EAGLContext setCurrentContext:context];
        // Create default framebuffer object.
        glGenFramebuffers(1, &defaultFramebuffer);
        glBindFramebuffer(GL_FRAMEBUFFER, defaultFramebuffer);
        // Create color render buffer and allocate backing store.
        glGenRenderbuffers(1, &colorRenderbuffer);
        glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);
        // Create depth render buffer.
        glGenRenderbuffers(1, &depthRenderbuffer);
        glBindRenderbuffer(GL_RENDERBUFFER, depthRenderbuffer);
        [context renderbufferStorage:GL_RENDERBUFFER fromDrawable:(CAEAGLLayer *)self.layer];
        glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &framebufferWidth);
        glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &framebufferHeight);
        glBindRenderbuffer(GL_RENDERBUFFER, depthRenderbuffer);
        glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, framebufferWidth, framebufferHeight);
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, colorRenderbuffer);
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthRenderbuffer);
        if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
            NSLog(@"Failed to make complete framebuffer object %x", glCheckFramebufferStatus(GL_FRAMEBUFFER));
    }
}
Regards
Niclas
The problem is that you're binding storage twice to your depth renderbuffer, and never to your color renderbuffer. The -renderbufferStorage:fromDrawable: message to your EAGLContext attaches storage to the currently bound renderbuffer, which in your case is the depth renderbuffer. You then bind storage to it again with the glRenderbufferStorage call.
The solution is to bind the color renderbuffer before sending the storage message, so that the storage gets bound to it. That is, insert a line glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer); directly above the message, after you create the depth renderbuffer. I was able to reproduce your error and then solve it in exactly this way.
NB. Always make sure the correct buffers are bound. You can check using glGetIntegerv() for the binding, and glGetRenderbufferParameteriv() for additional parameters.
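As a sketch, the corrected ordering (keeping Niclas's variable names; untested, but following the fix described above) would look like:

```objc
// Color renderbuffer: bind it so the drawable's storage attaches here.
glGenRenderbuffers(1, &colorRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);
[context renderbufferStorage:GL_RENDERBUFFER fromDrawable:(CAEAGLLayer *)self.layer];
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &framebufferWidth);
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &framebufferHeight);

// Depth renderbuffer: bind it separately and allocate storage explicitly.
glGenRenderbuffers(1, &depthRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, depthRenderbuffer);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, framebufferWidth, framebufferHeight);

// Attach both to the framebuffer.
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, colorRenderbuffer);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthRenderbuffer);
```

The key point is simply that renderbufferStorage:fromDrawable: and glRenderbufferStorage each act on whatever renderbuffer is bound at the time of the call.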

OpenGL-ES, iPhone and intermittent error: GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT_OES (0x8CD6)

I have an app that uses OpenGL-ES and an EAGLContext within a UIView - very much like Apple's GLPaint sample code app.
It might be significant that I see this bug on my iPhone 4 but not on my iPad.
Mostly, this works very well. However, I am getting GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT_OES from glCheckFramebufferStatusOES() within the createFrameBuffer method. The reason is that the backingWidth and backingHeight are both 0.
I am trying to understand the relation between self.layer and its size - which is not (0,0) - and the values for backingWidth and backingHeight. My UIView and its CALayer both have the 'correct' size, while glGetRenderbufferParameterivOES() returns 0 for GL_RENDERBUFFER_WIDTH_OES and GL_RENDERBUFFER_HEIGHT_OES.
Here is my createFrameBuffer method - which works much of the time.
- (BOOL)createFramebuffer
{
    // Generate IDs for a framebuffer object and a color renderbuffer.
    glGenFramebuffersOES(1, &viewFramebuffer);
    glGenRenderbuffersOES(1, &viewRenderbuffer);
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    // This call associates the storage for the current render buffer with the
    // EAGLDrawable (our CAEAGLLayer), allowing us to draw into a buffer that will
    // later be rendered to screen wherever the layer is (which corresponds with our view).
    [context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:(id<EAGLDrawable>)self.layer];
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, viewRenderbuffer);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);
    DLog(@" backing size = (%d, %d)", backingWidth, backingHeight);
    err = glGetError();
    if (err != GL_NO_ERROR)
        DLog(@"Error. glError: 0x%04X", err);
    // For this sample, we also need a depth buffer, so we'll create and attach one via another renderbuffer.
    glGenRenderbuffersOES(1, &depthRenderbuffer);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthRenderbuffer);
    glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, backingWidth, backingHeight);
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, depthRenderbuffer);
    if (glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) != GL_FRAMEBUFFER_COMPLETE_OES)
    {
        NSLog(@"failed to make complete framebuffer object 0x%X", glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES));
        return NO;
    }
    return YES;
}
When backingWidth and backingHeight are non-zero, then there is no error returned from glCheckFramebufferStatusOES().
I had this same problem. For me, the fix came from noticing that in last year's OpenGL sample code, Apple rebuilt the renderbuffer in every layoutSubviews call. Now, if you create an OpenGL project from the current iPhone template, you will see that layoutSubviews only destroys the renderbuffer. Then, on every draw, the renderbuffer is created only if it is nil. This is better because by the time you are about to draw, all CALayers etc. should be all shined up and ready to go.
I think that the renderbuffer in my case was trying to be built when the EAGLView layer was not serviceable, i.e. in some teardown state. In any case, when I changed my code to match, it worked.
Also there are fewer calls to this code, which is likely faster. On startup there is a lot of scene loading and moving about, which generates 1/2 dozen layout sub view calls with my app.
Since the comments in Apple's code tend to be few and far between, the fact that there is one in the layoutsubviews call is significant:
// The framebuffer will be re-created at the beginning of the next setFramebuffer method call.
--Tom
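A sketch of that lazy pattern, for anyone wanting to replicate it (method and ivar names follow the template's conventions; this is a paraphrase of the template, not a verbatim quote):

```objc
- (void)layoutSubviews
{
    // Only tear down here; the buffers are lazily re-created on the next draw,
    // by which time the layer is laid out and serviceable.
    [self destroyFramebuffer];
}

- (void)setFramebuffer
{
    [EAGLContext setCurrentContext:context];
    if (!viewFramebuffer) {
        [self createFramebuffer];
    }
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
    glViewport(0, 0, backingWidth, backingHeight);
}
```

With this structure, the glGetRenderbufferParameterivOES calls inside createFramebuffer run against a fully laid-out layer, which is why the zero backingWidth/backingHeight problem goes away.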
I had this same problem also, using Apple's OpenGL-ES sample code that does a destroyFramebuffer, createFramebuffer then drawView with the layoutSubviews function.
What you want to do is create the frame buffer in the drawView call as Tom says above, but additionally, you also want to defer the call to drawView until the layoutSubviews function returns. The way I did this was:
- (void)layoutSubviews
{
    [EAGLContext setCurrentContext:context];
    [self destroyFramebuffer];
    // Create the framebuffer in drawView instead, as needed. As before, create
    // would occasionally happen when the view wasn't serviceable (?) and cause
    // a crash. Additionally, send the drawView call to the main UI thread
    // (a la PostMessage in Win32) so that it is deferred until this function
    // returns and the message loop has a chance to process other stuff, etc.,
    // so the EAGLView will be ready to use when createFramebuffer is finally
    // called and the glGetRenderbufferParameterivOES calls to get the backing
    // width and height for the render buffer will always work. (Occasionally I
    // would see them come back as zero on my old first-gen phone, and this
    // crashes OpenGL.)
    //
    // Also, using the original method, I would see memory warnings in the
    // debugger console window with my iPad when rotating (not all the time,
    // but pretty frequently). These seem to have gone away using this new
    // deferred method...
[self performSelectorOnMainThread:@selector(drawView)
                       withObject:nil
                    waitUntilDone:NO];
}
Ross

Why does calling glMatrixMode(GL_PROJECTION) give me EXC_BAD_ACCESS in an iPhone app?

I have an iPhone app where I call these three functions in applicationDidFinishLaunching:
glMatrixMode(GL_PROJECTION);
glOrthof(0, rect.size.width, 0, rect.size.height, -1, 1);
glMatrixMode(GL_MODELVIEW);
When stepping through with the debugger, I get EXC_BAD_ACCESS on the first line. Any ideas why this is happening?
Btw, I have another application where I do the same thing and it works fine, so I've tried to duplicate everything from that app (#imports, adding the OpenGLES framework, etc.), but now I'm just stuck.
I've run into this with OpenGL calls if two threads are attempting to draw to the OpenGL scene at once. However, that doesn't sound like what you're doing.
Have you properly initialized your display context and framebuffer before this call? For example, in my UIView subclass that does OpenGL drawing, I call the following in its initWithCoder: method:
context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];
if (!context || ![EAGLContext setCurrentContext:context] || ![self createFramebuffer])
{
[self release];
return nil;
}
The createFramebuffer method looks like the following:
- (BOOL)createFramebuffer
{
    glGenFramebuffersOES(1, &viewFramebuffer);
    glGenRenderbuffersOES(1, &viewRenderbuffer);
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    [context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:(CAEAGLLayer*)self.layer];
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, viewRenderbuffer);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);
    if (USE_DEPTH_BUFFER) {
        glGenRenderbuffersOES(1, &depthRenderbuffer);
        glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthRenderbuffer);
        glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, backingWidth, backingHeight);
        glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, depthRenderbuffer);
    }
    if (glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) != GL_FRAMEBUFFER_COMPLETE_OES)
    {
        return NO;
    }
    return YES;
}
This is pretty much boilerplate code, as generated by the OpenGL ES Application template in Xcode. Perhaps by not initializing things before calling glMatrixMode(), you're getting a crash.
Also, why are you doing OpenGL drawing in applicationDidFinishLaunching:? Wouldn't a view or view controller be a more appropriate place for OpenGL calls than your UIApplicationDelegate?
Unlikely to be the problem given the date on which you posted the question, but you'd also see something like this if you use the Apple example code and run on an ES 2.0-capable device, since ES 2.0 removes the matrix stack from the spec, though the function definitions will remain visible to the compiler because the device also supports ES 1.1.
I've seen this error in many different situations but never specifically in yours. It usually comes up as a result of the application trying to access memory that has already been released.
Can you confirm that rect is still allocated?
You need to replace the current matrix with the identity matrix before calling glOrthof. This can be done with glLoadIdentity().
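Following that suggestion, the question's three lines would become something like this (a sketch only; it assumes rect is in scope and a valid GL context is already current, which is the real issue behind the crash):

```objc
glMatrixMode(GL_PROJECTION);
glLoadIdentity();   // reset the projection matrix before applying the ortho transform
glOrthof(0, rect.size.width, 0, rect.size.height, -1, 1);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
```

Without the glLoadIdentity() calls, glOrthof multiplies onto whatever matrix was previously current, which compounds if the setup code runs more than once.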
Restart the iPhone Simulator. This issue is definitely due to the OpenGL context not being set properly. I found that sometimes the iPhone Simulator has issues and needs to be restarted for the OpenGL context to get set properly by [EAGLContext setCurrentContext:].