OpenGL ES 2.0 on iOS: non-square texture gets distorted when rendered

I am working with OpenGL ES 2.0 on iOS and I have this setup...
A texture (T1) with some image (size 1024x768).
An off-screen FBO (FBO1) associated with texture T1, so that I can render into T1.
Another texture (T2) of the same size.
Another FBO (FBO2) associated with T2. (I will render to texture T2 with it).
Now, in a loop, I render the content of T1 into FBO2 (so T1 gets transferred to T2) and then render T2 into FBO1 (T2 gets transferred back into T1).
After a couple of iterations (just 7-8), the original image that was loaded into T1 gets severely distorted/blurred (as if a horizontal blur had been applied).
But if I do the same thing with a square image (it need not be a power of two), the image remains clear.
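A plausible cause (my conjecture, consistent with the symptoms) is that each copy pass samples the source slightly off-center, so GL_LINEAR averages neighbouring texels and the error compounds on every round trip. A 1D sketch of that effect:

```c
#include <assert.h>

/* One copy pass that samples half a texel off with linear filtering:
 * each output value is the average of two neighbouring inputs, so
 * sharp features smear a little more on every pass. */
static void copy_half_texel_off(const float *src, float *dst, int n) {
    for (int i = 0; i < n; i++) {
        int j = (i + 1 < n) ? i + 1 : i;   /* clamp to edge */
        dst[i] = 0.5f * (src[i] + src[j]);
    }
}
```

After one pass a single bright texel is already spread across two texels; seven or eight round trips turn it into a wide smear, which matches the horizontal blur described above.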
Here is the code that I use to create the texture..
GLuint texture = 0;
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
And here is the code I use to create the FBO...
glGenFramebuffers(1, &targetFBO);
glBindFramebuffer(GL_FRAMEBUFFER, targetFBO);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, texture, 0);
My shaders are the most basic ones, just renders the texture into gl_FragColor.
Here is my rendering code (Objective-C)...
- (void) draw {
GLfloat textureCoordinates[] = {
0.0f, 0.0f,
1.0f, 0.0f,
0.0f, 1.0f,
1.0f, 1.0f,
};
GLfloat imageVertices[] = {
-1.0f, -1.0f,
1.0f, -1.0f,
-1.0f, 1.0f,
1.0f, 1.0f,
};
glViewport(0, 0, targetWidth, targetHeight);
glBindFramebuffer(GL_FRAMEBUFFER, targetFBO);
glUseProgram(program);
glActiveTexture(GL_TEXTURE2);
glBindTexture(GL_TEXTURE_2D, textureID);
glUniform1i(inputTextureLocation, 2);
glDisable(GL_BLEND);
glVertexAttribPointer(positionLocation, 2, GL_FLOAT, 0, 0, imageVertices);
glVertexAttribPointer(inputTextureCoordinateLocation, 2, GL_FLOAT, 0, 0, textureCoordinates);
// Clear the screen
glClearColor(0.0, 0.0, 0.0, 0.0);
glClear(GL_COLOR_BUFFER_BIT);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
}
And my vertex shader...
attribute vec4 position;
attribute vec4 inputTextureCoordinate;
varying vec2 textureCoordinate;
void main()
{
gl_Position = position;
textureCoordinate = inputTextureCoordinate.xy;
}
Fragment shader...
varying highp vec2 textureCoordinate;
uniform sampler2D inputTexture;
void main()
{
gl_FragColor = texture2D(inputTexture, textureCoordinate);
}
Original image...
Distorted image ...
How do I avoid blurring on non-square textures?
Edit: The problem is actually with non-power-of-two (NPOT) textures, not just non-square ones. I am not able to find a solution, so I am going with POT textures.
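For reference, the POT approach can be sketched like this (illustrative helper, not my actual code): allocate the texture at the next power-of-two size, upload the image into a corner, and scale the texture coordinates by width/potWidth and height/potHeight.

```c
#include <assert.h>

/* Round a 32-bit value up to the next power of two, to pick POT
 * texture dimensions for an NPOT image. */
static unsigned next_pot(unsigned v) {
    v--;
    v |= v >> 1;  v |= v >> 2;  v |= v >> 4;
    v |= v >> 8;  v |= v >> 16;
    return v + 1;
}
```

A 1024x768 image would then live in a 1024x1024 texture, with the T coordinate scaled by 768/1024 = 0.75.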

Related

Drawing to the alpha channel of the frame buffer with OpenGL while ignoring RGB values?

The opaque property is set to NO on my OpenGL view's CAEAGLLayer. There is a UIImageView behind the OpenGL view (the checkerboard). A texture is drawn initially on the OpenGL view (the Emu bird photo). Now I want to draw another texture in the middle of the frame buffer. The new texture is completely black, and its alpha changes from 0 to 255, top to bottom.
This is my 2nd texture...
This is what I want...
This is what I get...
The checkerboard texture is a UIImage in a UIImageView behind the EAGLView.
I do not want to disturb the RGB values in the frame buffer; I only want to write into the alpha channel. I tried...
glDisable(GL_BLEND) and glColorMask(0, 0, 0, 1)
glEnable(GL_BLEND) and glBlendFuncSeparate(GL_ZERO, GL_ONE, GL_ONE, GL_ZERO)
Nothing seems to work. The RGB values are always modified; the pixels become brighter.
If the source RGBA is (r2, g2, b2, a2) and the destination is (r1, g1, b1, a1), I want the final value to be (r1, g1, b1, a2), or preferably (r1, g1, b1, some_function_of(a1, a2)). How do I achieve this?
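For what it's worth, attempt 2 above should produce exactly that on paper. Simulating the blend arithmetic of glBlendFuncSeparate(GL_ZERO, GL_ONE, GL_ONE, GL_ZERO) in plain C (illustrative only):

```c
#include <assert.h>

/* What glBlendFuncSeparate(GL_ZERO, GL_ONE, GL_ONE, GL_ZERO) computes:
 * RGB:   src*0 + dst*1  -> framebuffer RGB preserved
 * alpha: src*1 + dst*0  -> incoming alpha written */
typedef struct { float r, g, b, a; } RGBA;

static RGBA blend_keep_rgb_take_alpha(RGBA src, RGBA dst) {
    RGBA out = { dst.r, dst.g, dst.b, src.a };
    return out;
}
```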
Here is my code...
Objective C code for drawing the second texture...
- (void) drawTexture {
GLfloat textureCoordinates[] = {
0.0f, 1.0f,
1.0f, 1.0f,
0.0f, 0.0f,
1.0f, 0.0f,
};
static const GLfloat imageVertices[] = {
-0.5f, -0.5f,
0.5f, -0.5f,
-0.5f, 0.5f,
0.5f, 0.5f,
};
[EAGLContext setCurrentContext:context];
glViewport(0, 0, backingWidth, backingHeight);
glUseProgram(program);
glActiveTexture(GL_TEXTURE2);
glBindTexture(GL_TEXTURE_2D, texture);
glEnable (GL_BLEND);
glBlendFuncSeparate(GL_ZERO, GL_ONE, GL_ONE, GL_ZERO);
glUniform1i(inputImageTexture, 2);
glVertexAttribPointer(position, 2, GL_FLOAT, 0, 0, imageVertices);
glVertexAttribPointer(inputTextureCoordinate, 2, GL_FLOAT, 0, 0, textureCoordinates);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
glBindRenderbuffer(GL_RENDERBUFFER, viewRenderbuffer);
[context presentRenderbuffer:GL_RENDERBUFFER];
}
Vertex shader...
attribute vec4 position;
attribute vec4 inputTextureCoordinate;
varying vec2 textureCoordinate;
void main()
{
gl_Position = position;
textureCoordinate = inputTextureCoordinate.xy;
}
Fragment shader...
varying highp vec2 textureCoordinate;
uniform sampler2D inputImageTexture;
void main()
{
gl_FragColor = texture2D(inputImageTexture, textureCoordinate);
}
glColorMask must work - it worked for me in my case. Also, as far as I know, glBlendFuncSeparate is buggy on iOS.
(In theory) you can do the following:
set blending to multiply mode (GL_DST_COLOR, GL_ONE_MINUS_SRC_ALPHA)
make your texture white
set glColorMask to all channels enabled
If you have an iDevice with iOS 5+, you can look at the framebuffer alpha in the OpenGL Capture window in Xcode.

Draw a line on top of triangles

I created a new iPhone OpenGL Project in Xcode. I filled my background with triangles and gave them a texture, see below:
CGImageRef spriteImage;
CGContextRef spriteContext;
GLubyte *spriteData;
size_t width, height;
// Sets up matrices and transforms for OpenGL ES
glViewport(0, 0, backingWidth, backingHeight);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
//glRotatef(-90,0,0,1);
glOrthof(-1.0f, 1.0f, -1.5f, 1.5f, -1.0f, 1.0f);
glMatrixMode(GL_MODELVIEW);
// Clears the view with black
glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
// Sets up pointers and enables states needed for using vertex arrays and textures
glVertexPointer(2, GL_FLOAT, 0, vertices);
glEnableClientState(GL_VERTEX_ARRAY);
//glColorPointer(4, GL_FLOAT, 0, triangleColors);
//glColor4f(0.0f,1.0f,0.0f,1.0f);
//glEnableClientState(GL_COLOR_ARRAY);
glTexCoordPointer(2, GL_FLOAT, 0, spriteTexcoords);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
// Creates a Core Graphics image from an image file
spriteImage = [UIImage imageNamed:@"Bild.png"].CGImage;
// Get the width and height of the image
width = CGImageGetWidth(spriteImage);
height = CGImageGetHeight(spriteImage);
// Texture dimensions must be a power of 2. If you write an application that allows users to supply an image,
// you'll want to add code that checks the dimensions and takes appropriate action if they are not a power of 2.
if(spriteImage) {
// Allocated memory needed for the bitmap context
spriteData = (GLubyte *) calloc(width * height * 4, sizeof(GLubyte));
// Uses the bitmap creation function provided by the Core Graphics framework.
spriteContext = CGBitmapContextCreate(spriteData, width, height, 8, width * 4, CGImageGetColorSpace(spriteImage), kCGImageAlphaPremultipliedLast);
// After you create the context, you can draw the sprite image to the context.
CGContextDrawImage(spriteContext, CGRectMake(0.0, 0.0, (CGFloat)width, (CGFloat)height), spriteImage);
// You don't need the context at this point, so you need to release it to avoid memory leaks.
CGContextRelease(spriteContext);
// Use OpenGL ES to generate a name for the texture.
glGenTextures(1, &spriteTexture);
// Bind the texture name.
glBindTexture(GL_TEXTURE_2D, spriteTexture);
// Set the texture parameters to use a minifying filter and a linear filter (weighted average)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
// Specify a 2D texture image, providing a pointer to the image data in memory
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, spriteData);
// Release the image data
free(spriteData);
// Enable use of the texture
glEnable(GL_TEXTURE_2D);
// Set a blending function to use
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
// Enable blending
glEnable(GL_BLEND);
I have two questions, because I am not so familiar with OpenGL.
I want to write a method which takes two points as parameters and draws a line between these two points above my triangles (background).
- (void) drawLineFromPoint1:(CGPoint)point1 toPoint2:(CGPoint)point2 {
GLfloat triangle[] = { //Just example points
0.0f, 0.0f,
0.1f, 0.0f,
0.1f, 0.0f,
0.1f, 0.1f
};
GLfloat triangleColors[] = {
0.5f, 0.5f, 0.5f, 1.0f
};
//now draw the triangle
}
Something like that. Now I want a second method which erases this line (and not the background).
My drawing method looks like this:
- (void)drawView
{
// Make sure that you are drawing to the current context
[EAGLContext setCurrentContext:context];
glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
glClear(GL_COLOR_BUFFER_BIT);
glDrawElements(GL_TRIANGLES, number_vertices, GL_UNSIGNED_SHORT, indices);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
[context presentRenderbuffer:GL_RENDERBUFFER_OES];
}
It would be great if you could give me some hints/help,
cheers
The conventional approach would be to redraw everything whenever you move or erase a line.
Well, I got it to work. I had just forgotten to set the vertex pointer to my triangles in drawView. This now works:
- (void)drawView
{
[EAGLContext setCurrentContext:context];
glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
glClear(GL_COLOR_BUFFER_BIT);
glVertexPointer(2, GL_FLOAT, 0, vertices);
glDrawElements(GL_TRIANGLES, number_vertices, GL_UNSIGNED_SHORT, indices);
[self drawLines];
glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
[context presentRenderbuffer:GL_RENDERBUFFER_OES];
}
- (void) drawLines{
glDisable(GL_TEXTURE_2D);
GLfloat points[4];
for (Dataset *data in buttons) {
CGPoint s = [data screenPosition];
CGPoint p = [data slot];
points[0] = (GLfloat)(768-s.y);
points[1] = (GLfloat)(1024-s.x);
points[2] = (GLfloat)(768-p.y);
points[3] = (GLfloat)(1024-p.x);
glVertexPointer(2, GL_FLOAT, 0, points);
glDrawArrays(GL_LINE_STRIP, 0, 2);
}
glEnable(GL_TEXTURE_2D);
}
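The coordinate mapping buried in the loop above can be pulled out into a helper (the 768x1024 extents are assumed from the constants in drawLines; the function name is illustrative, not from the original code):

```c
#include <assert.h>

/* Maps a portrait UIKit point into the landscape GL coordinates
 * used in drawLines above (768x1024 screen assumed). */
typedef struct { float x, y; } Vec2;

static Vec2 portrait_to_landscape(float sx, float sy) {
    Vec2 v = { 768.0f - sy, 1024.0f - sx };
    return v;
}
```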

Why can't my fragment shader read alpha information when I switch to an alpha-only (A8) pixelFormat?

I'm creating an iOS app that uses OpenGL ES 2.0. I'm somewhat new to OpenGL, so this may be a trivial mistake.
I've created a simple shader to handle masking one texture with another, using the alpha channel. The mask texture is initialized like so:
glGenTextures(1, &name);
glBindTexture(GL_TEXTURE_2D, name);
glTexParameterf(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE );
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1024, 768, 0, GL_RGBA, GL_UNSIGNED_BYTE, 0);
glGenFramebuffersOES(1, &buffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, buffer);
glFramebufferTexture2DOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_TEXTURE_2D, name, 0);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, 0);
I then (later) draw some stuff to the mask texture, and link the mask texture to this fragment shader to apply it to another texture:
uniform sampler2D baseTextureSampler;
uniform sampler2D maskTextureSampler;
varying lowp vec4 colorVarying;
varying mediump vec2 textureCoordsVarying;
varying mediump vec2 positionVarying;
void main() {
lowp vec4 texel = texture2D(baseTextureSampler, textureCoordsVarying);
lowp vec4 maskTexel = texture2D(maskTextureSampler, positionVarying);
gl_FragColor = texel*colorVarying*maskTexel.a;
}
This renders exactly how I want it to. However, to reduce the overhead in binding the mask texture's buffer (which seems substantial) I'm trying to use an 8-bit alpha-only pixel format for the mask texture. BUT, if I change the code from the RGBA setup:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1024, 768, 0, GL_RGBA, GL_UNSIGNED_BYTE, 0);
...to alpha-only:
glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, 1024, 768, 0, GL_ALPHA, GL_UNSIGNED_BYTE, buffer);
...the fragment shader fails to draw anything. Since I'm only using the alpha from the mask texture in the first place, it seems like it should continue to work. Any ideas why it doesn't?
The OpenGL ES 2.0 spec (or the GL_OES_framebuffer_object extension) doesn't define any renderable formats with only alpha bits, so they are NOT supported. You will have to use an RGBA format.
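Note that if the A8 format was meant to save memory for textures you upload yourself (rather than render into - render targets still need RGBA), you can keep the CPU-side staging buffer alpha-only and expand it on upload. A sketch with a hypothetical helper:

```c
#include <assert.h>
#include <stddef.h>

/* Expands a tightly packed A8 staging buffer into RGBA8 for upload,
 * since ES 2.0 defines no color-renderable alpha-only formats. */
static void expand_a8_to_rgba8(const unsigned char *a8,
                               unsigned char *rgba8, size_t count) {
    for (size_t i = 0; i < count; i++) {
        rgba8[4 * i + 0] = 0;        /* R unused by the mask shader */
        rgba8[4 * i + 1] = 0;        /* G unused */
        rgba8[4 * i + 2] = 0;        /* B unused */
        rgba8[4 * i + 3] = a8[i];    /* alpha carried through */
    }
}
```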

OpenGL ES Framebuffer weird mirroring when drawing

I really can't wrap my mind around this:
Previously I couldn't get framebuffers to work, but I've got them going now. However, there is this incredibly weird mirroring going on with the texture generated from the framebuffer, and I have no idea why. Basically, I try to draw a texture at (0,0) using GL_TRIANGLE_FAN; the texture appears as normal (more or less) in the top-right corner, but also appears in the bottom-left corner, mirrored. If I fill up most or all of my viewport area with the same texture, the result is an ugly z-fighting overlap.
Screenshots will do this more justice.
Original image:
Original http://img301.imageshack.us/img301/1518/testsprite.png
Drawn 80x80 at (0,0)
80x80 http://img407.imageshack.us/img407/8339/screenshot20100106at315.png
Drawn 100x180 at (0,0)
100x180 http://img503.imageshack.us/img503/2584/screenshot20100106at316.png
Drawn 320x480 at (0,0)
320x480 http://img85.imageshack.us/img85/9172/screenshot20100106at317.png
And here is my code:
Set up the view:
//Apply the 2D orthographic projection.
glViewport(0,0,320,480);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrthof(0, 320, 480, 0, -10000.0f, 100.0f);
//Disable depth testing.
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glDisable(GL_DEPTH_TEST);
//Enable vertex and texture coordinate arrays.
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glShadeModel(GL_SMOOTH);
glClearColor(0.5f, 0.5f, 0.5f, 1.0f);
glGetError(); // Clear error codes
sprite = [Sprite createSpriteFromImage:@"TestSprite.png"];
[sprite retain];
[self createTextureBuffer];
Create the texture buffer.
- (void) createTextureBuffer
{
// generate texture
glGenTextures(1, &bufferTexture);
glBindTexture(GL_TEXTURE_2D, bufferTexture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 512, 512, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL); // check if this is right
// generate FBO
glGenFramebuffersOES(1, &framebuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, framebuffer);
// associate texture with FBO
glFramebufferTexture2DOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_TEXTURE_2D, bufferTexture, 0);
// clear texture bind
glBindTexture(GL_TEXTURE_2D,0);
// check if it worked (probably worth doing :) )
GLuint status = glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES);
if (status != GL_FRAMEBUFFER_COMPLETE_OES)
{
printf("FBO didn't work...");
}
}
Run the render loop.
- (void)drawView
{
[self drawToTextureBuffer];
// Make sure that you are drawing to the current context
[EAGLContext setCurrentContext:context];
//Bind the GLView's buffer.
glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
glViewport(0, 0, 320, 480);
//Clear the graphics context.
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
//Push the matrix so we can keep it as it was previously.
glPushMatrix();
//Rotate to match landscape mode.
glRotatef(90.0, 0, 0, 1);
glTranslatef(0.0f, -320.0f, 0.0f);
//Store the coordinates/dimensions from the rectangle.
float x = 0.0f;
float y = 0.0f;
float w = 480.0f;
float h = 320.0f;
// Set up an array of values to use as the sprite vertices.
GLfloat vertices[] =
{
x, y,
x, y+h,
x+w, y+h,
x+w, y
};
// Set up an array of values for the texture coordinates.
GLfloat texcoords[] =
{
0, 0,
0, 1,
1, 1,
0, 1
};
//Render the vertices by pointing to the arrays.
glVertexPointer(2, GL_FLOAT, 0, vertices);
glTexCoordPointer(2, GL_FLOAT, 0, texcoords);
// Set the texture parameters to use a linear filter when minifying.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
//Enable 2D textures.
glEnable(GL_TEXTURE_2D);
//Bind this texture.
glBindTexture(GL_TEXTURE_2D, bufferTexture);
//Finally draw the arrays.
glDrawArrays(GL_TRIANGLE_FAN, 0, 4);
//Restore the model view matrix to prevent contamination.
glPopMatrix();
GLenum err = glGetError();
if (err != GL_NO_ERROR)
{
NSLog(@"Error on draw. glError: 0x%04X", err);
}
glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
[context presentRenderbuffer:GL_RENDERBUFFER_OES];
}
On the first pass, the render loop will draw the image into the FBO.
- (void)drawToTextureBuffer
{
if (!bufferWasCreated)
{
// render to FBO
glBindFramebufferOES(GL_FRAMEBUFFER_OES, framebuffer);
// set the viewport, as the FBO isn't the same dimension as the screen
glViewport(0, 0, 512, 512);
glPushMatrix();
//Store the coordinates/dimensions from the rectangle.
float x = 0.0f;
float y = 0.0f;
float w = 320.0f;
float h = 480.0f;
// Set up an array of values to use as the sprite vertices.
GLfloat vertices[] =
{
x, y,
x, y+h,
x+w, y+h,
x+w, y
};
// Set up an array of values for the texture coordinates.
GLfloat texcoords[] =
{
0, 0,
0, 1,
1, 1,
1, 0
};
//Render the vertices by pointing to the arrays.
glVertexPointer(2, GL_FLOAT, 0, vertices);
glTexCoordPointer(2, GL_FLOAT, 0, texcoords);
// Set the texture parameters to use a linear filter when minifying.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
//Allow transparency and blending.
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
//Enable 2D textures.
glEnable(GL_TEXTURE_2D);
//Bind this texture.
glBindTexture(GL_TEXTURE_2D, sprite.texture);
//Finally draw the arrays.
glDrawArrays(GL_TRIANGLE_FAN, 0, 4);
//Restore the model view matrix to prevent contamination.
glPopMatrix();
GLenum err = glGetError();
if (err != GL_NO_ERROR)
{
NSLog(@"Error on draw. glError: 0x%04X", err);
}
//Unbind this buffer.
glBindFramebufferOES(GL_FRAMEBUFFER_OES, 0);
bufferWasCreated = YES;
}
}
Your texcoords in - (void)drawView seem to be wrong
GLfloat texcoords[] =
{
0, 0,
0, 1,
1, 1,
0, 1 << HERE should be 1, 0
};

OpenGL ES Render to Texture

I have been having trouble finding straightforward code to render a scene to a texture in OpenGL ES (specifically for the iPhone, if that matters). I am interested in knowing the following:
How do you render a scene to a texture in OpenGL ES?
What parameters must you use to create a texture that is capable of being a render target in OpenGL ES?
Are there any implications with applying this rendered texture to other primitives?
This is how I'm doing it.
I define a texture variable (I use Apple's Texture2D class, but you can use an OpenGL texture id if you want), and a frame buffer:
Texture2d * texture;
GLuint textureFrameBuffer;
Then at some point, I create the texture and the frame buffer and attach the texture to it. You only need to do this once:
texture = [[Texture2D alloc] initWithData:0
pixelFormat:kTexture2DPixelFormat_RGB888
pixelsWide:32
pixelsHigh:32
contentSize:CGSizeMake(width, height)];
// create framebuffer
glGenFramebuffersOES(1, &textureFrameBuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, textureFrameBuffer);
// attach the texture to the framebuffer's color attachment
glFramebufferTexture2DOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_TEXTURE_2D, texture.name, 0);
// unbind frame buffer
glBindFramebufferOES(GL_FRAMEBUFFER_OES, 0);
Every time I want to render to the texture, I do:
glBindFramebufferOES(GL_FRAMEBUFFER_OES, textureFrameBuffer);
...
// GL commands
...
glBindFramebufferOES(GL_FRAMEBUFFER_OES, 0);
Regarding your question 3: that's it - you can use the texture as if it were any other texture.
To render the scene to a texture you must use a framebuffer with an associated texture. Here is a method that I created to simplify it:
void glGenTextureFromFramebuffer(GLuint *t, GLuint *f, GLsizei w, GLsizei h)
{
glGenFramebuffers(1, f);
glGenTextures(1, t);
glBindFramebuffer(GL_FRAMEBUFFER, *f);
glBindTexture(GL_TEXTURE_2D, *t);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, *t, 0);
GLuint depthbuffer;
glGenRenderbuffers(1, &depthbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, depthbuffer);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, w, h);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthbuffer);
GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if(status != GL_FRAMEBUFFER_COMPLETE)
NSLog(@"Framebuffer status: %x", (int)status);
}
You can create the frame buffer and the texture easily:
GLuint _texture, _framebuffer;
GLsizei w,h;
float scale = [UIScreen mainScreen].scale;
w = self.view.bounds.size.width * scale;
h = self.view.bounds.size.height * scale;
glGenTextureFromFramebuffer(&_texture, &_framebuffer, w, h);
You can later use _framebuffer to render the scene into _texture in your draw method:
glBindFramebuffer(GL_FRAMEBUFFER, _framebuffer);
//draw here the content you want in the texture
//_texture is now a texture with the drawn content
//bind the base framebuffer
glBindFramebuffer(GL_FRAMEBUFFER, 0);
//or if you use GLKit
[view bindDrawable];
//draw normally
Now you can do what you want with the texture. If you want to do some post-processing (blur, bloom, shadows, etc.), you can!