OpenGL ES 1.1 iPhone - Depth buffer causes magenta screen

I have an iPhone OpenGL ES 1.1 project that renders simple 3D models. If I do not attach a depth buffer, everything renders correctly (except with no depth awareness, of course). When I attach the depth buffer, however, all that renders is a magenta screen. The clear color is not set to magenta; it is blue. Does anyone know what is going on here? This is my setup code:
glGenFramebuffersOES(1, &framebuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, framebuffer);
glGenRenderbuffersOES(1, &colorRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);
glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_RGBA8_OES, screenWidth, screenHeight);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, colorRenderbuffer);
glGenRenderbuffersOES(1, &depthRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthRenderbuffer);
glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, screenWidth, screenHeight);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, depthRenderbuffer);
This gives the magenta screen. Commenting out the last four lines makes the objects render again.

Probably your screenWidth / screenHeight aren't initialized yet? Check whether the following fixes your problem:
glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, 320, 480);
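If uninitialized dimensions are the problem, another option is to query the actual size of the color renderbuffer after its storage has been allocated, instead of hardcoding it. A minimal sketch, reusing the names from the setup code above:
GLint backingWidth, backingHeight;
glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);
glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);
// Size the depth buffer to match the color buffer exactly
glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthRenderbuffer);
glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, backingWidth, backingHeight);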

Related

OpenGL ES Retina Display Setup

I am trying to get an OpenGL project rendering correctly on an iPhone Retina display and I seem to be running into some difficulty. I have set the content scale factor to the device's, but now the framebuffer is failing to be created.
This is all done in a subclass of EAGLView, in the createFramebuffer method.
Here is my setup:
glGenFramebuffersOES(1, &viewFramebuffer);
glGenRenderbuffersOES(1, &viewRenderbuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
self.contentScaleFactor = [[UIScreen mainScreen] scale];
[context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:(CAEAGLLayer*)self.layer];
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, viewRenderbuffer);
glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);
// Depth
glGenRenderbuffersOES(1, &depthRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthRenderbuffer);
glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, backingWidth, backingHeight);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, depthRenderbuffer);
glEnable(GL_DEPTH_TEST);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_BLEND);
NSLog(#"scale Factor: %f",self.contentScaleFactor);
The scale factor is reported correctly but the display will not render and the buffer fails to be created. Am I missing something here?
I've run into the same problem. You just have to multiply the frame size by the screen scale, as I have stated in this answer.
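For illustration, multiplying the view's point size by the screen scale to get pixel dimensions might look like this (a sketch; the variable names are assumptions):
CGFloat scale = [[UIScreen mainScreen] scale];
GLint pixelWidth = (GLint)(self.bounds.size.width * scale);
GLint pixelHeight = (GLint)(self.bounds.size.height * scale);
// Any buffer you size yourself (such as the depth renderbuffer) needs pixel, not point, dimensions
glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, pixelWidth, pixelHeight);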

OpenGL ES 1.1 won't render textures?

Right, I posted a similar question; I've tried completely rebuilding the files, and what happens now is that textures aren't rendering at all. The image is 512x512. For some reason the texture works on the simulator but not on my iPod.
In a class called EAGLView there are beginDraw and finishDraw, which are called at the beginning and end of my game loop. layoutSubviews is called when I create the view.
-(void)beginDraw
{
// Make sure that you are drawing to the current context
[EAGLContext setCurrentContext:context];
glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
// make sure we are in model matrix mode and clear the frame
glMatrixMode(GL_MODELVIEW);
glClear(GL_COLOR_BUFFER_BIT);
// set a clean transform
glLoadIdentity();
}
-(void)finishDraw
{
glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
[context presentRenderbuffer:GL_RENDERBUFFER_OES];
}
- (void)layoutSubviews
{
[EAGLContext setCurrentContext:context];
[self destroyFramebuffer];
[self createFramebuffer];
[self setupViewLandscape];
}
- (BOOL)createFramebuffer {
glGenFramebuffersOES(1, &viewFramebuffer);
glGenRenderbuffersOES(1, &viewRenderbuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
[context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:(CAEAGLLayer*)self.layer];
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, viewRenderbuffer);
glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);
if (USE_DEPTH_BUFFER) {
glGenRenderbuffersOES(1, &depthRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthRenderbuffer);
glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, backingWidth, backingHeight);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, depthRenderbuffer);
}
if(glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) != GL_FRAMEBUFFER_COMPLETE_OES) {
NSLog(#"failed to make complete framebuffer object %x", glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES));
return NO;
}
return YES;
}
My TexturedQuad render method is:
-(void)render
{
glVertexPointer(vertexSize, GL_FLOAT, 0, vertexes);
glEnableClientState(GL_VERTEX_ARRAY);
glColorPointer(colorSize, GL_FLOAT, 0, colors);
glEnableClientState(GL_COLOR_ARRAY);
if (materialKey != nil) {
[[MaterialController sharedMaterialController] bindMaterial:materialKey];
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glTexCoordPointer(2, GL_FLOAT, 0, uvCoordinates);
}
//render
glDrawArrays(renderStyle, 0, vertexCount);
}
Also here is the bindMaterial method:
-(void)bindMaterial:(NSString*)materialKey
{
NSNumber *numberObj = [materialLibrary objectForKey:materialKey];
if (numberObj == nil) return;
GLuint textureID = [numberObj unsignedIntValue];
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, textureID);
}
It is called when my sceneObject is rendered:
(TexturedQuad is a subclass of mesh)
-(void)render
{
if (!mesh || !active) return; // if we do not have a mesh, no need to render
// clear the matrix
glPushMatrix();
glLoadIdentity();
glMultMatrixf(matrix);
[mesh render];
glPopMatrix();
}
And lastly my Test class:
Awake is called when the object is added to the scene
-(void)awake
{
self.mesh = [[MaterialController sharedMaterialController] quadFromAtlasKey:@"boxNotSelected"];
self.scale = BBPointMake(50.0, 50.0, 1.0);
}
Thanks for taking the time to read this, and thanks if you can offer any help =]
Are you making any calls to glTexParameter anywhere, and are you creating mipmaps? It could just be that the GPU is sampling a mipmap level you haven't supplied. What happens if you set MIN_FILTER to GL_NEAREST or GL_LINEAR?
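For example, the default minification filter (GL_NEAREST_MIPMAP_LINEAR) expects a full mipmap chain; if you never generate mipmaps, the GPU samples levels that don't exist. A quick test, assuming textureID is the texture being loaded:
glBindTexture(GL_TEXTURE_2D, textureID);
// GL_LINEAR and GL_NEAREST do not require mipmaps
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);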
The answer turned out to be that even though my texture was called menuAtlas both in the file navigator and in the code, it wouldn't work; changing both to mA made it work. I don't understand why, but I suspect it has something to do with caching.

How to draw a semi-transparent triangle to a back buffer, then render it?

I'm trying to draw a line with a semi-transparent color to an offscreen framebuffer, use glCopyTexSubImage2D to copy it into a texture, then draw that texture to the onscreen framebuffer.
I tried many configurations but only got an opaque line.
For more information, this is how I set up my OpenGL.
First, I subclass EAGLView, then add an offscreen framebuffer:
glGenFramebuffersOES(1, &offscreenFramebuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, offscreenFramebuffer);
glGenRenderbuffersOES(1, &offscreenRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, offscreenRenderbuffer);
glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_RGBA8_OES, kTextureOriginalSize, kTextureOriginalSize);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, offscreenRenderbuffer);
glGenRenderbuffersOES(1, &offscreenDepthBuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, offscreenDepthBuffer);
glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, kTextureOriginalSize, kTextureOriginalSize);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, offscreenDepthBuffer);
// Offscreen framebuffer texture target
glGenTextures(1, &offscreenRenderTexture);
glBindTexture(GL_TEXTURE_2D, offscreenRenderTexture);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR);
unsigned char * data = (unsigned char *)malloc( kTextureOriginalSize * kTextureOriginalSize * 4 );
memset( data, 0xff, kTextureOriginalSize * kTextureOriginalSize * 4 );
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, kTextureOriginalSize, kTextureOriginalSize, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);
glFramebufferTexture2DOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_TEXTURE_2D, offscreenRenderTexture, 0);
glEnable( GL_TEXTURE_2D);
glTexEnvf( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
glEnableClientState(GL_VERTEX_ARRAY);
glEnable(GL_BLEND);
glBlendFunc( GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA );
And this is how I copy the data from the offscreen framebuffer into the texture:
glBindTexture( GL_TEXTURE_2D, offscreenRenderTexture);
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, kTextureOriginalSize, kTextureOriginalSize);
I used glColor4f to change the alpha of the color before drawing.
Thanks for your help!
I found the answer:
First, I need to set the alpha very small, like 0.1, to see the effect.
Second, glBlendFunc() only has an effect on the active framebuffer, so I need to call glBlendFunc() for both the offscreen and the onscreen framebuffer. Or actually, what I do is disable GL_BLEND when rendering the texture to the onscreen framebuffer.
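A rough sketch of that last approach (disabling GL_BLEND for the onscreen pass), using the framebuffer names from the setup above; the onscreen framebuffer name is an assumption:
// Draw the semi-transparent line into the offscreen framebuffer with blending enabled
glBindFramebufferOES(GL_FRAMEBUFFER_OES, offscreenFramebuffer);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
// ... draw the line ...
// Then draw the copied texture to the onscreen framebuffer with blending disabled,
// so its alpha is not applied a second time
glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer); // onscreen FBO, name assumed
glDisable(GL_BLEND);
// ... draw the textured quad ...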

What's wrong with using depth render buffer? OpenGL ES 2.0

I use this code:
context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
if (!context || ![EAGLContext setCurrentContext:context] || ![self loadShaders])
{
[self release];
return nil;
}
glGenFramebuffers(1, &defaultFramebuffer);
glGenRenderbuffers(1, &colorRenderbuffer);
glGenRenderbuffers(1, &depthRenderbuffer);
glBindFramebuffer(GL_FRAMEBUFFER, defaultFramebuffer);
glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, depthRenderbuffer);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, colorRenderbuffer);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthRenderbuffer);
glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
But when I run the application I see a purple screen. What's wrong with this code?
I don't see where you bind your color renderbuffer to the CAEAGLLayer it is to be displayed on (although that may happen later), and I don't see you enabling depth testing. Also, if I'm not mistaken, you need to bind the color renderbuffer, call glFramebufferRenderbuffer() for that, then bind the depth renderbuffer and call it again.
The following is code that I've used to set up a similar display on OpenGL ES 2.0:
glEnable(GL_DEPTH_TEST);
glGenFramebuffers(1, &viewFramebuffer);
glBindFramebuffer(GL_FRAMEBUFFER, viewFramebuffer);
glGenRenderbuffers(1, &viewRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, viewRenderbuffer);
[context renderbufferStorage:GL_RENDERBUFFER fromDrawable:(CAEAGLLayer*)self.layer];
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &backingWidth);
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &backingHeight);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, viewRenderbuffer);
glGenRenderbuffers(1, &depthRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, depthRenderbuffer);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, backingWidth, backingHeight);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthRenderbuffer);
if(glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
{
NSLog(#"Failure with framebuffer generation");
return NO;
}
The full code of this application is available here.
I realize how old this is, but I did the exact same thing with the exact same results, so I thought I would contribute. What had happened in my project was that I hit the enter key too quickly on the auto-complete when setting up the render buffer, resulting in this:
glGenBuffers(1, &_colorRenderBuffer);
instead of this:
glGenRenderbuffers(1, &_colorRenderBuffer);
End result: purple screen.

Draw to offscreen renderbuffer in OpenGL ES (iPhone)

I'm trying to create an offscreen render buffer in OpenGL ES on the iPhone. I've created the buffer like this:
glGenFramebuffersOES(1, &offscreenFramebuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, offscreenFramebuffer);
glGenRenderbuffersOES(1, &offscreenRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, offscreenRenderbuffer);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, offscreenRenderbuffer);
But I'm confused about how to create the storage. Apple's documentation says to use the EAGLContext renderbufferStorage:fromDrawable: method, but this seems to work only for one render buffer (the main one being displayed). If I use the normal OpenGL function glRenderbufferStorageOES, then I can't seem to get it to display. Here's the code:
// this is in the initialization section:
glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_RGB8_OES, backingWidth, backingHeight);
// and this is when I'm trying to draw to it and display it:
glBindFramebufferOES(GL_FRAMEBUFFER_OES, offscreenFramebuffer);
GLfloat vc[] = {
0.0f, 0.0f, 0.0f,
10.0f, 10.0f, 10.0f,
0.0f, 0.0f, 0.0f,
-10.0f, -10.0f, -10.0f,
};
glLoadIdentity();
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, vc);
glDrawArrays(GL_LINES, 0, 4);
glDisableClientState(GL_VERTEX_ARRAY);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, offscreenRenderbuffer);
[context presentRenderbuffer:GL_RENDERBUFFER_OES];
Doing it this way, nothing is displayed on the screen. However, if I switch the references from "offscreen...Buffer" to the buffers that were created with the renderbufferStorage:fromDrawable: method, it works fine.
Any suggestions?
Since you can't use presentRenderbuffer with an offscreen FBO, you should associate it with a texture object using glFramebufferTexture2DOES, then render a textured full-screen quad.
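A minimal ES 1.1 full-screen quad draw along those lines might look like this (the texture and array names are hypothetical, and an identity modelview with a projection covering -1..1 is assumed):
static const GLfloat quadVertices[] = { -1,-1, 1,-1, -1,1, 1,1 };
static const GLfloat quadTexCoords[] = { 0,0, 1,0, 0,1, 1,1 };
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, offscreenTexture); // the texture attached to the offscreen FBO
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glVertexPointer(2, GL_FLOAT, 0, quadVertices);
glTexCoordPointer(2, GL_FLOAT, 0, quadTexCoords);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);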
@david Good idea. What you need to do is what @prideout said: create a texture and render to it, then use that texture on a quad every time. Make sure you draw to the texture only once, as in your case things are persistent.
- (void)setUpTextureBuffer
{
glGenFramebuffersOES(1, &texturebuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, texturebuffer);
// create the texture
glGenTextures(1, &canvastexture);
glBindTexture(GL_TEXTURE_2D, canvastexture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 512, 512, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glFramebufferTexture2DOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_TEXTURE_2D, canvastexture, 0);
GLenum status = glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES);
if(status != GL_FRAMEBUFFER_COMPLETE_OES) {
NSLog(#"failed to make complete framebuffer object %x", status);
}
glTexParameterf( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
glTexParameterf( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
glClearColor(1.0, 1.0, 1.0, 1.0);
glViewport(0, 0, 512, 512);
glClear(GL_COLOR_BUFFER_BIT);
}
//setTargetToTexture() function
glBindFramebufferOES(GL_FRAMEBUFFER_OES, tbuffer);
glBindTexture(GL_TEXTURE_2D, allbrushes);
glViewport(0, 0, 512, 512);
//reset pointers after finishing drawing to textures
glViewport(0, 0, BWIDTH, BHEIGHT);
glVertexPointer(2, GL_FLOAT, 0, canvas); //canvas vertices
glTexCoordPointer(2, GL_FLOAT, 0, texels);
glBindTexture(GL_TEXTURE_2D, boundtexture); //bind to the texture which is the special render target
glBindFramebufferOES(GL_FRAMEBUFFER_OES, fbuffer); //back to normal framebuffer
You cannot present a normal renderbuffer (created with glRenderbufferStorage); it is always offscreen. presentRenderbuffer: can only be used with renderbuffers that were created using renderbufferStorage:fromDrawable:. If you check the return value of that presentRenderbuffer: call, you should see it failing.
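A quick way to verify that is to check the BOOL returned by presentRenderbuffer: (a minimal sketch):
if (![context presentRenderbuffer:GL_RENDERBUFFER_OES]) {
NSLog(@"presentRenderbuffer failed - this renderbuffer has no CAEAGLLayer backing");
}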
What are you trying to do?