BGRA on iPhone: glTexImage2D and glReadPixels

Looking at the docs, I should be able to use BGRA for the internal format of a texture. I am supplying the texture with BGRA data (and using GL_RGBA8_OES for glRenderbufferStorage, since BGRA does not appear to be allowed there). However, the following does not work:
glTexImage2D(GL_TEXTURE_2D, 0, GL_BGRA, w, h, 0, GL_BGRA, GL_UNSIGNED_BYTE, buffer); // BGRA internal format
...
glReadPixels(0, 0, w, h, GL_BGRA, GL_UNSIGNED_BYTE, buffer);
While this gives me a black frame:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_BGRA, GL_UNSIGNED_BYTE, buffer);
...
glReadPixels(0, 0, w, h, GL_BGRA, GL_UNSIGNED_BYTE, buffer); // reading back as BGRA
And this does work, but the blues/reds are inverted (I supply BGRA data to the texture):
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_BGRA, GL_UNSIGNED_BYTE, buffer);
...
glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, buffer); // reading back as RGBA
...why can't I just use BGRA throughout? I do notice that glRenderbufferStorage does not seem to accept any BGRA formats... I'm really confused. BGRA is the only suitable format for my data, as it comes from the iPhone's camera.

The third parameter to glTexImage2D() is the number of color components in the texture, not the pixel ordering of the texture. You want to use GL_RGBA here, or it just won't work.
I don't believe GL_BGRA is supported by glReadPixels() on iOS devices as a color format. While providing pixel data to the textures in BGRA format is recommended by Apple when processing video image frames, I think you're fine in reading that back in RGBA format and then encoding that to disk, as you've described elsewhere.
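If the downstream encoder really does want BGRA bytes, a plain CPU swizzle after an RGBA readback is one option. A minimal sketch, reusing the question's w, h, and buffer (the in-place red/blue swap is my assumption about what the encoder needs):
glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
// Swap the red and blue channels in place, turning RGBA into BGRA.
for (int i = 0; i < 4 * w * h; i += 4) {
    GLubyte r = buffer[i];
    buffer[i] = buffer[i + 2];
    buffer[i + 2] = r;
}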
If you want to see a sample project that takes camera video frames in BGRA, sends them to a texture, processes them using shaders, and then reads the resulting pixels back, you can check out the one I built here.

My own looking around for OpenGL ES 2 indicates that this works for the BGRA little-endian native format under iOS. Here "pixels" points to a buffer of 32-bit BGRA data, such as you would get from CGBitmapContextCreate() or from pulling the image data out of a CGImageRef.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE, pixels);
See the Apple extension for this here.
There is a good discussion of this issue here.
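For safety, you can verify at runtime that the extension is actually present before using GL_BGRA_EXT. A minimal sketch (strstr() needs <string.h>; the extension name comes from the APPLE_texture_format_BGRA8888 spec):
const char *ext = (const char *)glGetString(GL_EXTENSIONS);
if (ext != NULL && strstr(ext, "GL_APPLE_texture_format_BGRA8888") != NULL) {
    // BGRA uploads are available; the internal format stays GL_RGBA.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE, pixels);
} else {
    // Fall back to swizzling the pixels to RGBA on the CPU first.
}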

Related

OpenGL ES texture combines instead of replaces (works on device, not on Simulator)

I'm repeatedly rendering a UIView to an OpenGL texture. Things work well on the device (the texture updates as expected). On the Simulator the texture is initially correct (correct alpha and colour), but subsequent updates seem to combine with the existing texture (as if 'pasted' onto it) instead of replacing it, gradually producing an ugly mess.
Some (possibly) relevant context:
I'm using OpenGL ES 1.1
I'm running Xcode 4.0.2 (Build 4A2002a) on OSX 10.6.8 on a 2007 MBP (Radeon X1600 video)
The project uses iOS SDK 4.3 and the deployment target is iOS 4.0
Here is the code which renders the view to the texture (the same code is responsible for the initial render and subsequent updates).
// render UIView to pixel buffer
GLubyte *pixelBuffer = (GLubyte *)malloc(4 * width * height);
CGColorSpaceRef colourSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pixelBuffer, width, height, 8, 4 * width, colourSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGColorSpaceRelease(colourSpace);
[view.layer renderInContext:context];
// replace OpenGL texture with pixelBuffer data
glBindTexture(GL_TEXTURE_2D, textureId);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixelBuffer);
// glTexImage2D() copies the data, so the context and buffer can be released here
CGContextRelease(context);
free(pixelBuffer);
Initially I wasn't worried about the difference between the Simulator and the device, but now I need to make instructional videos using the Simulator.
(Interestingly, the overwriting shows more RGB noise when the simulated device is set to iPhone than when it is set to iPhone (Retina).)
I came across a similar problem myself. I don't know why there's a difference in implementation between the Simulator and the device, but what I found worked was making sure the pixel buffer was zeroed before using it. If the texture I was loading had completely transparent pixels, the Simulator wasn't bothering to set the values for those pixels!
So, try using calloc() instead of malloc(), which should initialize the memory to zeros, i.e. something like...
GLubyte *pixelBuffer = (GLubyte *)calloc(4 * width * height, sizeof(GLubyte));
...or memset()...
GLubyte *pixelBuffer = (GLubyte *)malloc(4 * width * height);
memset(pixelBuffer, 0, 4 * width * height);

glReadPixels and GL_ALPHA

I'm trying to read the alpha pixel values using glReadPixels. The first thing I did was read the pixels individually. To try to speed things up, I tried reading all the pixels at once:
GLubyte *pixels = new GLubyte[w*h*4];
glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
and it worked, but it was really slow. Now I'm trying to retrieve just the alpha value, without wasting space on the RGB components:
GLubyte *pixels = new GLubyte[w*h];
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glPixelStorei(GL_PACK_ALIGNMENT, 1);
glReadPixels(0, 0, w, h, GL_ALPHA, GL_UNSIGNED_BYTE, pixels);
But I get: OpenGL error 0x0500 in -[EAGLView swapBuffers].
Any idea as to why an INVALID_ENUM (0x0500) is thrown?
According to the documentation on glReadPixels() for OpenGL ES, the only valid enum values for the format parameter are GL_RGBA and GL_IMPLEMENTATION_COLOR_READ_FORMAT_OES. You'd need to check and see what GL_IMPLEMENTATION_COLOR_READ_FORMAT_OES means as a format for the iPhone, but it may not provide support for GL_ALPHA.
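The extra pair can be queried at runtime. A small sketch using the OES_read_format queries (the enums come from the standard ES 1.x headers; w, h, and pixels are the question's own variables):
GLint readFormat = 0, readType = 0;
glGetIntegerv(GL_IMPLEMENTATION_COLOR_READ_FORMAT_OES, &readFormat);
glGetIntegerv(GL_IMPLEMENTATION_COLOR_READ_TYPE_OES, &readType);
// glReadPixels() is only required to accept GL_RGBA/GL_UNSIGNED_BYTE
// plus whatever single extra format/type pair these queries return.
if (readFormat == GL_ALPHA && readType == GL_UNSIGNED_BYTE) {
    glReadPixels(0, 0, w, h, GL_ALPHA, GL_UNSIGNED_BYTE, pixels);
}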
In any case, I doubt that going that route will dramatically speed up your reads, because all that will do is discard the RGB components. Your performance issues with glReadPixels() probably lie elsewhere. A good discussion of the reasons for this can be found in the thread here.
Would it be possible for you to render into an offscreen framebuffer that was backed by a texture, then do further processing on the GPU using that texture? This sounds like it would yield better performance than using glReadPixels().
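For reference, attaching a texture to an offscreen framebuffer under the OES_framebuffer_object extension looks roughly like this. A sketch only; the dimensions and filtering are placeholders:
GLuint fbo, tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glGenFramebuffersOES(1, &fbo);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, fbo);
glFramebufferTexture2DOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_TEXTURE_2D, tex, 0);
// Anything drawn now lands in tex, which a later pass can sample on the GPU.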

Rendering A Texture Onto Itself In OpenGL ES

I can comfortably render a scene to a texture and map that texture back onto a framebuffer for screen display. But what if I wanted to map the texture back onto itself in order to blur it (say, at a quarter opacity in a new location). Is that possible?
The way I've done it is simply to enable the texture:
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, color_tex);
And then draw to it:
glVertexPointer(2, GL_FLOAT, 0, sv);
glTexCoordPointer(2, GL_FLOAT, 0, tcb1);
glColor4f(1.0f, 1.0f, 1.0f, 0.25f);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
(some code omitted, obviously)
Is there anything obviously wrong with that idea? Am I being an idiot?
No, you can't render a texture onto itself; that triggers undefined behaviour.
But you can use a technique called ping-pong rendering: you draw the result of the operation into a second texture, and if you need to do more processing, you write the result back into the first texture, alternating between the two.
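A minimal sketch of the alternation, assuming two texture-backed framebuffers have already been created (fbo[i] renders into tex[i]; numPasses is a placeholder):
int src = 0, dst = 1;
for (int pass = 0; pass < numPasses; pass++) {
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, fbo[dst]); // write into one texture...
    glBindTexture(GL_TEXTURE_2D, tex[src]);             // ...while sampling the other
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    int tmp = src; src = dst; dst = tmp;                // swap roles for the next pass
}
// The final result is in tex[src].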

iPhone: Texture bigger than 64x64?

I started from the GLPaint example. I'm trying to put a background into the PaintingView, so you could draw over the background and finally save the image as a file. I'm lost.
I'm loading a 512x512 PNG and trying to "paint with it" at the very beginning of the program, but it's painted as 64x64 instead of 512x512.
I previously tried loading it as a subview of the painting view, but then glReadPixels doesn't work as expected (it only takes the PaintingView into account, not the subview). Also, PaintingView doesn't have anything like an initWithImage method. I need glReadPixels to work on the image (and on any modifications to it), but I really don't know why the texture ends up 64x64 when I load it.
The GLPaint example project uses GL_POINT_SPRITE to draw copies of the brush texture as you move the brush. On the iPhone, the point size (glPointSize) is limited to 64 pixels. This is a hardware limitation; in the Simulator I think you can make it larger.
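You can confirm the limit on any given device by querying the supported point size range; a two-line check:
GLfloat range[2];
glGetFloatv(GL_ALIASED_POINT_SIZE_RANGE, range);
// range[1] is the largest point sprite the hardware will rasterize.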
It sounds like you're trying to use a GL_POINT_SPRITE method to draw your background image, and that's really not what you want. Instead, try drawing a flat, textured box that fills the screen.
Here's a bit of OpenGL code that sets up vertices and texcoords for a 2D box and then draws it:
const GLfloat vertices[] = {
    0.0f, 0.0f,
    1.0f, 0.0f,
    0.0f, 1.0f,
    1.0f, 1.0f,
};
const GLfloat texcoords[] = {
    0, 0,
    1, 0,
    0, 1,
    1, 1,
};
glVertexPointer(2, GL_FLOAT, 0, vertices);
glEnableClientState(GL_VERTEX_ARRAY);
glTexCoordPointer(2, GL_FLOAT, 0, texcoords);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, texture);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
Hope that helps! Note that you need to specify the vertices differently depending on how your camera projection is set up. In my case, I set up my GL_MODELVIEW matrix using the code below; I'm not sure how the GLPaint example does it.
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glOrthof(0.0f, 1.0f, 0.0f, 1.0f, -1.0f, 1.0f);
First, glReadPixels() is only going to see whatever framebuffer is associated with your current OpenGL context. That might explain why you're not getting the pixels you expect.
Second, what do you mean by the texture being rendered at a specific pixel size? I assume the texture is rendered as a quad, and then the size of that quad ought to be under your control, code-wise.
Also, check that loading the texture doesn't generate an OpenGL error. I'm not sure what the iPhone's limits on texture sizes are; it's quite conceivable that 512x512 is out of range. You can of course investigate this yourself by calling glGetIntegerv() with the GL_MAX_TEXTURE_SIZE constant.
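For example:
GLint maxTextureSize = 0;
glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxTextureSize);
// 512x512 is only safe if maxTextureSize is at least 512.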

Texture2D iPhone SDK OpenGL

I'm using the Texture2D class in an iPhone game using OpenGL ES.
Are there any good tutorials for understanding the Texture2D class?
Specifically, I'm looking at the initWithString method for drawing text. As implemented, you get white text when you use it. I would like to modify the method so I can specify the RGB color of the text. Any help / pointers?
Because the class uses an alpha-only texture (read the code!), it will display in whatever color glColor has set. See this line in initWithData (which gets called by initWithString):
glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, width, height, 0, GL_ALPHA, GL_UNSIGNED_BYTE, data);
For red text, just call glColor4ub(255, 0, 0, 255) prior to drawing the texture.
Make sure you enable GL_BLEND and GL_COLOR_MATERIAL prior to drawing.
The class is small. I recommend you just read it.
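Putting that together, a minimal draw sequence for red text might look like the following. This is a sketch: the blend function shown assumes non-premultiplied alpha, and the actual draw call would be one of Texture2D's own helpers such as drawAtPoint:.
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); // assumes non-premultiplied alpha
glColor4ub(255, 0, 0, 255);        // red; the alpha-only texture is tinted by this
/* draw the text texture here, e.g. [label drawAtPoint:point] */
glColor4ub(255, 255, 255, 255);    // reset so later draws aren't tinted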