destroyFramebuffer method in an OpenGL app - iPhone

In my OpenGL-style iPhone app, I have the method below in the GLView implementation.
I noticed during development that every time I started the app, the last frame that was on screen when I previously shut the app down would be drawn first, before the app started its animation.
But when I commented out this method, the app started up fine without the old end frame from the previous run.
So I have something wrong here, possibly directly in this method. Any ideas?
(Now I'm getting a blank white screen as a first frame just before the animation, so something is still amiss.)
//------------------------------------------------------------------------------------
- (void)destroyFramebuffer
{
    NSLog(@"destroyFramebuffer");

    glDeleteFramebuffersOES(1, &viewFramebuffer);
    viewFramebuffer = 0;
    glDeleteRenderbuffersOES(1, &viewRenderbuffer);
    viewRenderbuffer = 0;

    if (depthRenderbuffer)
    {
        glDeleteRenderbuffersOES(1, &depthRenderbuffer);
        depthRenderbuffer = 0;
    }
}
Below is how the createFramebuffer method looks.
(The GLView implementation was starting up with these three lines:)
[EAGLContext setCurrentContext:context];
// [self destroyFramebuffer];
[self createFramebuffer];
//------------------------------------------------------------------------------------
- (BOOL)createFramebuffer
{
    NSLog(@"createFramebuffer");

    glGenFramebuffersOES(1, &viewFramebuffer);
    glGenRenderbuffersOES(1, &viewRenderbuffer);

    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    [context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:(CAEAGLLayer *)self.layer];
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, viewRenderbuffer);

    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);

    if (USE_DEPTH_BUFFER)
    {
        glGenRenderbuffersOES(1, &depthRenderbuffer);
        glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthRenderbuffer);
        glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, backingWidth, backingHeight);
        glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, depthRenderbuffer);
    }

    if (glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) != GL_FRAMEBUFFER_COMPLETE_OES)
    {
        NSLog(@"failed to make complete framebuffer object %x", glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES));
        return NO;
    }
    return YES;
}

When you destroy your renderbuffer, the memory is reclaimed. The chances of the same memory being used again next time are quite high, but that's not guaranteed. You simply need to clear the buffer on creation (or at least render something that fills the buffer):
glClearColor(0, 0, 0, 0);
glClear(GL_COLOR_BUFFER_BIT);
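As a sketch of where that clear could go (assuming the createFramebuffer method shown above): put it right after the framebuffer passes the completeness check, so the first thing that can ever be presented is a known colour rather than recycled memory. The presentRenderbuffer call is optional and only there if you want the cleared frame on screen before your first real frame.
// At the end of -createFramebuffer, once glCheckFramebufferStatusOES reports completeness:
glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
glClearColor(0, 0, 0, 0);
glClear(GL_COLOR_BUFFER_BIT);
// Optional: present the cleared buffer so nothing stale ever reaches the screen.
glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
[context presentRenderbuffer:GL_RENDERBUFFER_OES];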
It's also worth changing your destroy method to check that your framebuffer and renderbuffer actually exist; otherwise you may be asking GL to delete objects you never created. For example:
if (viewFramebuffer) {
    glDeleteFramebuffersOES(1, &viewFramebuffer);
    viewFramebuffer = 0;
}
if (viewRenderbuffer) {
    glDeleteRenderbuffersOES(1, &viewRenderbuffer);
    viewRenderbuffer = 0;
}
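For reference, Apple's original OpenGL ES template pairs the two methods in layoutSubviews, so the framebuffer is torn down and rebuilt whenever the layer is laid out. Roughly like this (a sketch from memory, with drawView standing in for whatever method renders a frame):
- (void)layoutSubviews
{
    [EAGLContext setCurrentContext:context];
    [self destroyFramebuffer];
    [self createFramebuffer];
    [self drawView];   // render a real frame straight away so nothing stale gets presented
}
Rendering (or at least clearing) immediately after creation is what keeps the recycled buffer contents from ever reaching the screen.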

Related

iPhone OpenGL createFramebuffer problem

I'm working on an OpenGL view.
My intention is to update the OpenGL view in real time: if I change the vertex data, the view should change too. But no luck; there seems to be a bug: createFramebuffer fails, with backingWidth/backingHeight coming back as 0 the second time the OpenGL view is laid out.
1. I also tried putting createFramebuffer in drawView; that failed too.
2. When glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) != GL_FRAMEBUFFER_COMPLETE_OES, I also tried creating the framebuffer again, but that failed as well.
Someone else ran into the same thing: OpenGL-ES, iPhone and intermittent error: GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT_OES (0x8CD6)
Has anybody else encountered this problem? How did you deal with it?
Thank you in advance!
- (BOOL)createFramebuffer {
    NSLog(@"createFramebuffer");

    //******************************************************
    // Create the framebuffer and renderbuffer objects
    //******************************************************
    glGenFramebuffersOES(1, &viewFramebuffer);
    glGenRenderbuffersOES(1, &viewRenderbuffer);

    //******************************************************
    // Bind the framebuffer and renderbuffer objects to the pipeline
    //******************************************************
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);

    //******************************************************
    // Allocate storage
    //******************************************************
    [context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:(CAEAGLLayer *)self.layer];

    //******************************************************
    // Attach the renderbuffer object to the framebuffer object
    //******************************************************
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, viewRenderbuffer);

    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);
    NSLog(@" backing size = (%d, %d)", backingWidth, backingHeight);

    if (USE_DEPTH_BUFFER) {
        glGenRenderbuffersOES(1, &depthRenderbuffer);
        glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthRenderbuffer);
        glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, backingWidth, backingHeight);
        glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, depthRenderbuffer);
    }

    if (glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) != GL_FRAMEBUFFER_COMPLETE_OES) {
        NSLog(@"failed to make complete framebuffer object %x", glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES));
        return NO;
    }
    return YES;
}
I found a workaround, but it costs more memory: re-initialize the OpenGL view whenever the vertex data changes.
Does anybody have a better approach?
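One pattern that avoids re-initializing the whole view, and which the other snippets on this page use, is to keep the framebuffer objects and only re-allocate the colour buffer's storage from the layer inside layoutSubviews, once the layer actually has a non-zero size. A sketch, assuming the same viewRenderbuffer / depthRenderbuffer / backingWidth / backingHeight instance variables as above:
- (void)layoutSubviews
{
    [EAGLContext setCurrentContext:context];

    // Re-allocate colour storage from the (re)laid-out layer instead of recreating everything.
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    [context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:(CAEAGLLayer *)self.layer];
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);

    // Resize the depth buffer to match, if one is in use.
    if (depthRenderbuffer) {
        glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthRenderbuffer);
        glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, backingWidth, backingHeight);
    }
}
This only touches the storage, so the vertex data can change freely without rebuilding the view.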

OpenGL ES in another thread is not drawing on physical device

I have put my OpenGL ES initialization on another thread, using a CAEAGLLayer.
context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1 sharegroup:group];
if (!context || ![EAGLContext setCurrentContext:context])
{
    [self release];
}

// Create the system framebuffer object. The backing will be allocated in -reshapeFramebuffer.
glGenFramebuffersOES(1, &viewFramebuffer);
glGenRenderbuffersOES(1, &viewRenderbuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
[context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:eaglLayer];
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, viewRenderbuffer);
and then draw it to the screen:
BOOL rel = [EAGLContext setCurrentContext:context];
drawPixels(backingWidth, backingHeight, framebuf1, texID);
rel = [context presentRenderbuffer:GL_RENDERBUFFER_OES];
The buffer is displayed in the simulator, but on a device all I get is a black screen.
Do I need to configure something else?
OpenGL contexts can be current on only one thread at a time, so you first have to detach the context from one thread and then reattach it on the other. It looks like your code is missing the detach step.
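A minimal sketch of that hand-off (renderThreadMain and renderLoop are placeholder names, not part of the code above): detach the context on the thread that currently owns it, then make it current on the rendering thread before issuing any GL calls.
// On whichever thread previously owned the context (often the main thread):
[EAGLContext setCurrentContext:nil];

// Entry point of the rendering thread:
- (void)renderThreadMain
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    if ([EAGLContext setCurrentContext:context]) {
        [self renderLoop];   // do the drawing and presentRenderbuffer: calls here
    } else {
        NSLog(@"Failed to make the EAGLContext current on the render thread");
    }
    [pool release];
}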

Request a DepthBuffer in OpenGL ES for iPhone

I'm creating a 3D OpenGL ES view on the iPhone and want to set up a depth buffer, so I can use it. I'm calling glEnable(GL_DEPTH_TEST) and such, but because I haven't set up the z-buffer, it does nothing.
I'm looking for an equivalent call to
glutInitDisplayMode(GLUT_DEPTH)
Any help would be most welcome. Thanks!
As you suspect, you've no depth buffer. You'll need to attach a depth buffer to your frame buffer in whatever UIView subclass you've created that uses a CAEAGLLayer as its layer.
Supposing you're working with Apple's OpenGL ES Xcode template, the relevant UIView subclass is EAGLView. There's a method in there, createFramebuffer, that is responsible for creating the frame buffer. Initially it'll say:
- (void)createFramebuffer
{
    if (context && !defaultFramebuffer)
    {
        [EAGLContext setCurrentContext:context];

        // Create default framebuffer object.
        glGenFramebuffers(1, &defaultFramebuffer);
        glBindFramebuffer(GL_FRAMEBUFFER, defaultFramebuffer);

        // Create color render buffer and allocate backing store.
        glGenRenderbuffers(1, &colorRenderbuffer);
        glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);
        [context renderbufferStorage:GL_RENDERBUFFER fromDrawable:(CAEAGLLayer *)self.layer];
        glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &framebufferWidth);
        glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &framebufferHeight);

        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, colorRenderbuffer);

        if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
            NSLog(@"Failed to make complete framebuffer object %x", glCheckFramebufferStatus(GL_FRAMEBUFFER));
    }
}
What that does is generate and bind a frame buffer, then generate and bind a colour render buffer, give the colour buffer the backing storage that comes with a CAEAGLLayer, grab the buffer dimensions for later, and attach the colour buffer to the frame buffer.
You also need to create and attach a depth buffer, which should be as simple as the following (with a suitable instance variable added for depthRenderbuffer; typed directly in here):
glGenRenderbuffers(1, &depthRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, depthRenderbuffer);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, framebufferWidth, framebufferHeight);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthRenderbuffer);
That does what it looks like it does: generates and binds a render buffer, allocates its storage as a 16-bit depth buffer with the same dimensions as the colour buffer, and attaches it to the frame buffer.
So, in total (untested):
- (void)createFramebuffer
{
    if (context && !defaultFramebuffer)
    {
        [EAGLContext setCurrentContext:context];

        // Create default framebuffer object.
        glGenFramebuffers(1, &defaultFramebuffer);
        glBindFramebuffer(GL_FRAMEBUFFER, defaultFramebuffer);

        // Create color render buffer and allocate backing store.
        glGenRenderbuffers(1, &colorRenderbuffer);
        glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);
        [context renderbufferStorage:GL_RENDERBUFFER fromDrawable:(CAEAGLLayer *)self.layer];
        glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &framebufferWidth);
        glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &framebufferHeight);

        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, colorRenderbuffer);

        // Create depth render buffer and allocate backing store.
        glGenRenderbuffers(1, &depthRenderbuffer);
        glBindRenderbuffer(GL_RENDERBUFFER, depthRenderbuffer);
        glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, framebufferWidth, framebufferHeight);
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthRenderbuffer);

        if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
            NSLog(@"Failed to make complete framebuffer object %x", glCheckFramebufferStatus(GL_FRAMEBUFFER));
    }
}
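One small follow-up that the template snippet above doesn't show: once the depth attachment exists, clear it together with the colour buffer at the start of every frame, otherwise the previous frame's depth values will interfere with the depth test.
// At the top of the per-frame draw method:
glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);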

glDrawArrays crash with EXC_BAD_ACCESS

I'm writing an iPhone application which uses a UIView with a CAEAGLLayer as its layer. Everything is fine and working apart from one small problem: sometimes it crashes with EXC_BAD_ACCESS and the following stack trace:
[EAGLView draw]
glDrawArrays_Exec
PrepareToDraw
DrawFramebufferMakeResident
AttachmentMakeResident
TextureMakeResident
memmove
It crashes on the glDrawArrays line below:
glVertexPointer(3, GL_FLOAT, 0, vertexCoordinates);
glTexCoordPointer(2, GL_FLOAT, 0, textureCoordinates);
glBindTexture(GL_TEXTURE_2D, textures[kActiveSideLeft]);
glDrawArrays(GL_TRIANGLE_STRIP, 0, totalPoints); //<--Crash here
The application only crashes during an interface rotation change (which also happens to be the only case where the view frame changes). It doesn't crash often; most of the time it takes 3-5 minutes of rotating the device to reproduce the problem.
I believe I'm making a mistake related to CAEAGLLayer initialization / frame changes, since that seems to be where it crashes.
So here are the init and layout subviews methods:
Init:
...
CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;
eaglLayer.opaque = TRUE;
context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];
if (!context || ![EAGLContext setCurrentContext:context])
{
    [self release];
    return nil;
}
glGenFramebuffersOES(1, &defaultFramebuffer);
glGenRenderbuffersOES(1, &colorRenderbuffer);
glGenRenderbuffersOES(1, &depthRenderbuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, defaultFramebuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthRenderbuffer);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, colorRenderbuffer);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, depthRenderbuffer);
...
On frame changes I only set the GL_MODELVIEW and GL_PROJECTION matrices, so I guess nothing bad can happen there.
LayoutSubviews:
- (void)layoutSubviews
{
    [EAGLContext setCurrentContext:context];

    glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);
    [context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:(CAEAGLLayer *)self.layer];
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);

    glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthRenderbuffer);
    glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, backingWidth, backingHeight);

    NSAssert1(glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) == GL_FRAMEBUFFER_COMPLETE_OES, @"Failed to make complete framebuffer object: %X", glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES));
}
The draw method itself looks like this:
if ([EAGLContext currentContext] != context) {
    [EAGLContext setCurrentContext:context];
}
glBindFramebufferOES(GL_FRAMEBUFFER_OES, defaultFramebuffer);
glViewport(0, 0, backingWidth, backingHeight);
...//drawing different triangle strips here
glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);
[context presentRenderbuffer:GL_RENDERBUFFER_OES];
I'd appreciate any comments on the listed code or suggestions about how can I find out the cause of this bug.
I'd be suspicious of the totalPoints variable being passed to glDrawArrays, or maybe of your values for vertexCoordinates or textureCoordinates if those arrays aren't static. Your crash implies that you're walking off the end of memory while drawing the arrays. I'm less suspicious of your GL setup and more concerned about your memory management, and about what you're drawing differently during the rotation.
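A cheap way to test that theory is to assert on the count just before drawing. Here vertexCount is a hypothetical variable holding how many vertices were actually written into the arrays; the other names mirror the snippet in the question.
// Debug-build sanity check: glDrawArrays must not read past the end of the arrays.
NSAssert2(totalPoints <= vertexCount,
          @"glDrawArrays asked for %d vertices but only %d were written",
          totalPoints, vertexCount);
glDrawArrays(GL_TRIANGLE_STRIP, 0, totalPoints);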
(Also, FWIW, I don't think you should be calling renderbufferStorage every time you bind the render buffer(s); you should only need to do that once, when you create them. That said, I'm not sure you shouldn't actually destroy the buffers when their size changes and just recreate them from scratch.)
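If you do go the destroy-and-recreate route on resize, a sketch of layoutSubviews along those lines (same instance variables as in the question, and only a sketch, not a guaranteed fix) might look like this:
- (void)layoutSubviews
{
    [EAGLContext setCurrentContext:context];
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, defaultFramebuffer);

    // Re-allocate the colour storage from the (possibly resized) layer.
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);
    [context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:(CAEAGLLayer *)self.layer];
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);

    // Throw the old depth buffer away and build a fresh one at the new size.
    if (depthRenderbuffer) {
        glDeleteRenderbuffersOES(1, &depthRenderbuffer);
        depthRenderbuffer = 0;
    }
    glGenRenderbuffersOES(1, &depthRenderbuffer);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthRenderbuffer);
    glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, backingWidth, backingHeight);
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, depthRenderbuffer);

    NSAssert1(glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) == GL_FRAMEBUFFER_COMPLETE_OES,
              @"Failed to make complete framebuffer object: %X",
              glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES));
}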

Using depth buffer on iOS with OpenGL ES 1.1

I'm trying to get a depth buffer working so I can start making a game for iOS. I'm fine with OpenGL and Objective-C on OS X, but this is my first time making an iPhone application.
I removed the ES1Renderer and ES2Renderer classes and moved the ES1 code into the EAGLView class. The default animation worked when I did that. When I tried to use perspective, it went wrong: I now get a white screen. Can someone tell me what I'm doing wrong?
In the initialisation method:
// Create default framebuffer object.
glGenFramebuffersOES(1, &defaultFramebuffer);
glGenRenderbuffersOES(1, &colorRenderbuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, defaultFramebuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, colorRenderbuffer);
//Initialisation code for the game's graphics and to develop the scene
glEnable(GL_DEPTH_TEST); //Enables Depth Testing
glDepthFunc(GL_LEQUAL);
glDepthMask(GL_TRUE);
//glEnable(GL_CULL_FACE);
//glCullFace(GL_FRONT);
//Setup projection matrix
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
double xmax = 0.04142135624 * ((float) backingWidth)/backingHeight;
glFrustumf(-xmax, xmax, -0.04142135624, 0.04142135624, 0.1, 2000); //The ymax and min have been precalculated
glMatrixMode(GL_MODELVIEW); //Select The Modelview Matrix
glLoadIdentity();
In the drawView method:
static const GLfloat squareVertices[] = {
1, 1,-0.1f,
-1, 1,-0.1f,
1, -1,0.1f,
-1, -1,0.1f,
};
static const GLubyte squareColors[] = {
0,0,200,255,
40,90,250,255,
0,0,200,255,
50,100,230,255,
};
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glClear(GL_DEPTH_BUFFER_BIT);
glVertexPointer(3, GL_FLOAT, 0, squareVertices);
glEnableClientState(GL_VERTEX_ARRAY);
glColorPointer(4, GL_UNSIGNED_BYTE, 0, squareColors);
glEnableClientState(GL_COLOR_ARRAY);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
[context presentRenderbuffer:GL_RENDERBUFFER_OES];
In the layoutSubviews method:
// Allocate color buffer backing based on the current layer size
glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);
[context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:(CAEAGLLayer*) self.layer];
glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);
glGenRenderbuffersOES(1, &depthRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthRenderbuffer);
glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, backingWidth, backingHeight);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, depthRenderbuffer);
if (glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) != GL_FRAMEBUFFER_COMPLETE_OES)
{
    NSLog(@"Failed to make complete framebuffer object %x", glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES));
    return NO;
}
return YES;
[self drawView:nil];
I've done everything it seems I have to do; I don't know why it's so much more complicated on iOS.
Thank you for any help.
Have a look at the code you can find on this page: the author initialises an EAGLView subclass, parameterising the init method for the case where a depth buffer is needed.
I might be reading this wrong, but you are defining a clipping box with:
glFrustumf(-xmax, xmax, -0.04142135624, 0.04142135624, 0.1, 2000);
And the object that you're trying to render with:
static const GLfloat squareVertices[] = {
1, 1,-0.1f,
-1, 1,-0.1f,
1, -1,0.1f,
-1, -1,0.1f,
};
At the top you are clipping Z values from 0.1 to 2000, but your vertices have Z values from -0.1 to 0.1. These don't seem to overlap. You might want to try moving the object you want to look at into the region where you can see it.
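For example (a small tweak to the drawView code from the question, not something from the original answer): a translate on the model-view matrix pushes the square out in front of the camera, between the near plane at 0.1 and the far plane at 2000.
// In drawView, right after glLoadIdentity() on the model-view matrix:
glTranslatef(0.0f, 0.0f, -3.0f);   // the square now sits about 3 units in front of the camera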