OpenGL render-to-texture on iPhone fails when MSAA is enabled

My render to texture iPhone code only works if I disable MSAA, otherwise all I get is a black texture. What could be the cause of the problem?
Here is my code:
glViewport(0,0, target->_Width, target->_Height);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);
glBindTexture(GL_TEXTURE_2D, target->_Handle);
// render stuff here
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 0, 0, target->_Width, target->_Height, 0);
glBindTexture(GL_TEXTURE_2D, 0);

Apparently, when you are using MSAA for your main framebuffer, you have to use it for any other FBOs you want to render to as well. Since GL_TEXTURE_2D_MULTISAMPLE is not available on OpenGL ES 2, the solution I have found is quite simply to apply the same modifications you need to go from regular rendering to MSAA rendering, to your render-to-texture code as well.
You need 3 additional buffers: a multi-sampled color renderbuffer, a multi-sampled depth renderbuffer, and a new FBO to attach them to. Bind the new FBO instead of the texture FBO before rendering. After rendering, resolve the new MSAA FBO into the texture FBO, the same way you do in your main rendering code using glResolveMultisampleFramebufferAPPLE().
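For reference, here is a minimal sketch of that setup, assuming GL_APPLE_framebuffer_multisample is available and that texFBO is the FBO with your target texture attached (all handle names here are illustrative):
GLuint msaaFBO, msaaColor, msaaDepth;
glGenFramebuffers(1, &msaaFBO);
glBindFramebuffer(GL_FRAMEBUFFER, msaaFBO);
// Multisampled color renderbuffer (4 samples here; query GL_MAX_SAMPLES_APPLE for the limit).
glGenRenderbuffers(1, &msaaColor);
glBindRenderbuffer(GL_RENDERBUFFER, msaaColor);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER, 4, GL_RGBA8_OES, target->_Width, target->_Height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, msaaColor);
// Multisampled depth renderbuffer.
glGenRenderbuffers(1, &msaaDepth);
glBindRenderbuffer(GL_RENDERBUFFER, msaaDepth);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER, 4, GL_DEPTH_COMPONENT16, target->_Width, target->_Height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, msaaDepth);
// ... render the scene into msaaFBO here ...
// Resolve the multisampled samples into the texture FBO.
glBindFramebuffer(GL_READ_FRAMEBUFFER_APPLE, msaaFBO);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER_APPLE, texFBO);
glResolveMultisampleFramebufferAPPLE();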
Note that for some reason, rendering to a texture with MSAA enabled works without these modifications in the simulator. Maybe it uses GL_TEXTURE_2D_MULTISAMPLE automatically?

The cause of your problem is that you cannot render to a multisampled texture in OpenGL ES. Indeed, if I recall correctly, multisampled textures don't exist in OpenGL ES. Desktop OpenGL allows you to do it, but it introduces a whole new texture target (GL_TEXTURE_2D_MULTISAMPLE) in order to do it.
A buffer that offers multisampling is not the same thing as a regular texture; that's why Desktop OpenGL uses a special texture target, which has its own GLSL sampler type.

Related

Rendering Technique in OpenGL

I want to enable rendering at specified points (I am holding the collection of points in an array) and disable rendering on the rest of the OpenGL layer.
Can anyone help me solve this problem on the iPhone?
If you're using OpenGL ES 2.0, you can use the stencil buffer to mask those areas.
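A minimal sketch of that approach, assuming your framebuffer was created with a stencil attachment (drawMaskPoints and drawScene are hypothetical helpers standing in for your own geometry):
glEnable(GL_STENCIL_TEST);
// Pass 1: write 1s into the stencil buffer where the points are, without touching color.
glStencilFunc(GL_ALWAYS, 1, 0xFF);
glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
drawMaskPoints();
// Pass 2: render the layer only where the stencil value equals 1.
glStencilFunc(GL_EQUAL, 1, 0xFF);
glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
drawScene();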

Displaying images in open gl es 2.0 on iPhone

So I just started working on something in OpenGL ES 2.0. I get the general impression that the switch from the 1.1 template to the 2.0 template in Xcode has caused some confusion for everyone, so as a result there isn't much helpful material on 2.0 (if there is anything really good and informative out there for 2.0, like 71squared's videos on the 1.1 template, you're welcome to post a link to it).
My problem is displaying an image on the screen.
Right now, I've got this in my drawFrame method.
[(EAGLView *)self.view setFramebuffer];
// Replace the implementation of this method to do your own custom drawing.
static float transY = 0.0f;
glClearColor(0.0f, 1.0f, 0.0f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT);
[playerCube drawAtPoint:CGPointMake(160.0f, 240.0f)];
if ([context API] == kEAGLRenderingAPIOpenGLES2)
{
// Use shader program.
glUseProgram(program);
// Update uniform value.
glUniform1f(uniforms[UNIFORM_TRANSLATE], (GLfloat)transY);
transY += 0.075f;
// Validate program before drawing. This is a good check, but only really necessary in a debug build.
// DEBUG macro must be defined in your debug configurations if that's not already the case.
if (![self validateProgram:program])
{
NSLog(#"Failed to validate program: %d", program);
return;
}
}
else
{
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glTranslatef(0.0f, (GLfloat)(sinf(transY)/2.0f), 0.0f);
transY += 0.075f;
}
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
[(EAGLView *)self.view presentFramebuffer];
Essentially the template without the square bouncing around. This is in my ViewController.m, by the way. That works fine on its own. I've not had much luck with the code for displaying the image, however. I'm using the Texture2D file from Apple's Crash Landing example, but I haven't had much luck with that no matter where the image-related code is put.
I've used playerCube = [[Texture2D alloc] initWithImage:[UIImage imageNamed:@"Cubix.png"]]; to allocate it, which is probably right even if I've not been putting it in the right location, and [playerCube drawAtPoint:CGPointMake(160.0f, 240.0f)];.
Wait, that wouldn't show anything at the start because I've not set the height and width of the image, right? I guess I sort of answered that part of my question. Would it work if I used something like image.frame = CGRectMake(); or would that not work with Texture2D? Also, where should I be putting this code for it to work as it should?
Hope I made some sense in this. I've never posted a question on StackOverflow before.
I show how to display an image as a texture in OpenGL ES 2.0 on the iPhone in this example, where the image is a frame of video from the camera, and this example, where I pull in a PVR-compressed texture as an image. The former example is described in the writeup here.
I describe how the vertex and fragment shaders work, along with all the supporting code for them, in the video for the OpenGL ES 2.0 class which is part of my freely available iOS development course on iTunes U. Additionally, I highly recommend you read Jeff LaMarche's series of posted chapters from his unpublished OpenGL ES 2.0 book.
In short, once you have the texture loaded in, you'll need to attach it as a uniform to your shader program using code like the following:
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, myTexture);
glUniform1i(myTextureUniform, 0);
Within your fragment shader, you'll need to define the texture uniform:
uniform sampler2D myTexture;
and then sample the color from the texture at the appropriate point:
gl_FragColor = texture2D(myTexture, textureCoordinate);
Again, I go into this in more detail within my class, and you can use these examples as starting points to work from.
Texture2D, as presented in Crash Landing, isn't going to work with ES 2.x. It should correctly upload your texture, but the two drawing methods, drawAtPoint: and drawInRect:, won't work. They rely on the GL calls glVertexPointer and glTexCoordPointer; neither of those survives into ES 2.x. The logic behind that is that ES 1.x has a fixed pipeline, so it has a defined slot for vertex positions (which you supply with glVertexPointer), a defined slot for texture coordinates (via glTexCoordPointer), and a series of predefined ways in which that data can end up being processed. ES 2.x is fully programmable, so it has no need to supply one way for providing vertices, another for providing texture coordinates, a third for colours, etc. It has just one way to supply an attribute for a vertex, and it is up to you to specify how that relates to your vertex program, then up to your vertex program to figure out what needs to be processed and passed on to your fragment shader.
Really you shouldn't be conflating providing textures and providing geometry anyway. Crash Landing was withdrawn because it suggests poor programming patterns — my personal suspicion is that this was one of them.
In addition to not having fixed vertex attributes, ES 2.x doesn't supply any default shaders. If you want to texture map, you have to write suitable vertex and fragment shaders. The ones in the new GL template only do Gouraud shading, if I recall correctly.
An additional issue hinted at in the code is that ES 2.x doesn't have a matrix stack. You need to do something yourself about communicating to your vertex shader how it should map from the original vertices to screen locations. I suspect that if you just want to draw a texture as though 2d, that isn't so much of a problem.
Probably the correct route forward is:
modify Texture2D so that it can bind the texture it has loaded and remove all attempts to draw it
modify the vertex and fragment shader supplied in the GL template to do texturing (a sketch follows this list) and modify the buffer you submit so that it has texture coordinates in it
then modify the vertex shader so that you can do something other than a sine animation
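As a rough illustration of the second step, a minimal texturing shader pair might look like this (a sketch only; the attribute and uniform names are my own, not the template's):
// Vertex shader: pass the position through and hand the texture coordinate on.
static const char *texVertexShader =
    "attribute vec4 position;                        \n"
    "attribute vec2 texCoordIn;                      \n"
    "varying vec2 texCoordOut;                       \n"
    "void main() {                                   \n"
    "    texCoordOut = texCoordIn;                   \n"
    "    gl_Position = position;                     \n"
    "}                                               \n";
// Fragment shader: sample the bound texture at the interpolated coordinate.
static const char *texFragmentShader =
    "precision mediump float;                        \n"
    "uniform sampler2D tex;                          \n"
    "varying vec2 texCoordOut;                       \n"
    "void main() {                                   \n"
    "    gl_FragColor = texture2D(tex, texCoordOut); \n"
    "}                                               \n";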
If you're already happy with ES 1.x, probably a good place to start is the ES 2.x reference sheet and maybe the lighthouse3d GLSL tutorial — though that's for desktop OpenGL 2.0. On desktop GL 2.0 there's a lot of support built into GLSL for emulating the old fixed pipeline. None of that is in ES 2.x. But it should at least explain things about attributes, uniforms, samplers, etc. I'm afraid I've yet to come across an ES 2.x focussed tutorial.

ES 2.0 Multi-Pass & Render to Texture Implementation

I need help setting up multi-pass rendering with OpenGL ES 2.0 on the iPhone. I haven't been able to find an example which implements both rendering to a texture and multi-pass shading.
I'm looking for some instructions and sample code which implement:
First stage: Render to a texture
Second stage: Input that texture and render to screen
I have referenced Apple's OpenGL ES Programming Guide, OpenGL Shading Language (Orange Book), and O'Reilly's iPhone 3D Programming Book.
The Orange Book discusses deferred shading and provides two shader programs for first-pass and second-pass rendering, but doesn't provide example code to setup that application or show how to communicate data between both shaders.
Questions:
How to render to texture?
Using glDrawElements
How to input that texture to the next pass?
How to implement two shading programs?
How to alternate first- and second-pass shading programs?
Need to attach, detach, and call 'use' for each pass?
How to implement multi-pass shading?
I wrote a short example of doing just this (multiple render-to-texture passes on the iPhone using OpenGL ES 2.0) a few weeks ago: http://www.mat.ucsb.edu/a.forbes/blog/?p=245
Edit: this post is a bit old, and it has moved here: http://blog.angusforbes.com/openglglsl-render-to-texture/
Ok, first of all: I'm no expert on OpenGL ES 2.0. I was kind of in the same situation where I wanted to do a multipass render setup, in one of my first OpenGL ES applications.
I also used the Orange Book. Check Chapter 12, Framebuffer Objects > Examples. The first example demonstrates how to use a framebuffer to render to a texture, and then draws that texture to screen.
Basically using that example I created an application that renders some geometry to a texture using an effect shader, then renders that texture to screen, layered with some other content all using a different shader.
I'm not sure if this is the best approach, but it works for my purposes. My setup:
I create two framebuffers, the default one and an offscreen one, and the same for the renderbuffers
I create a texture which the app will render to
I bind the offscreen framebuffer, and attach the texture to it using glFramebufferTexture2D
My rendering (a sketch in code follows this list):
bind the offscreen framebuffer.
use my first shader program
draw my geometry
bind the default framebuffer
use my second shader program
draw a fullscreen quad with the texture attached to it.
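In code terms, the whole thing looks roughly like this (a sketch; offscreenFBO, defaultFBO, sceneTexture, the two program handles, the sizes, and the draw helpers are all assumptions standing in for your own objects):
// One-time setup: attach the texture as the offscreen FBO's color buffer.
glBindFramebuffer(GL_FRAMEBUFFER, offscreenFBO);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, sceneTexture, 0);
// Per frame, pass 1: render the geometry into the texture with the effect shader.
glBindFramebuffer(GL_FRAMEBUFFER, offscreenFBO);
glViewport(0, 0, texWidth, texHeight);
glUseProgram(effectProgram);
drawGeometry();
// Pass 2: render a fullscreen quad to the screen, sampling that texture.
glBindFramebuffer(GL_FRAMEBUFFER, defaultFBO);
glViewport(0, 0, screenWidth, screenHeight);
glUseProgram(compositeProgram);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, sceneTexture);
glUniform1i(textureUniform, 0);
drawFullscreenQuad();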

glFramebufferTexture2D performance

I'm doing heavy computation using the GPU, which involves a lot of render-to-texture operations. It's an iterative computation, so there's a lot of rendering to a texture, then rendering that texture to another texture, then rendering the second texture back to the first texture and so on, passing the texture through a shader each time.
My question is: is it better to have a separate FBO for each texture I want to render into, or should I rather have one FBO and bind the target texture using glFramebufferTexture2D each time I want to change render target?
My platform is OpenGL ES 2.0 on the iPhone.
On the iPhone implementation, it is inexpensive to change the attachment, assuming the old and new textures are the same dimensions/format/etc. Otherwise, the driver has to do some additional work to reconfigure the framebuffer.
AFAIK, better performance is achieved by using only one FBO, and changing the texture attachments as necessary.
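For example, an iterative ping-pong loop with a single FBO might look like this (a sketch; texA and texB are assumed to be two same-size, same-format textures, and drawFullscreenQuad is a hypothetical helper that runs your shader over every texel):
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
GLuint src = texA, dst = texB;
for (int i = 0; i < iterations; i++) {
    // Point the single FBO's color attachment at the destination texture.
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, dst, 0);
    // Sample from the source texture and run the shader pass.
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, src);
    drawFullscreenQuad();
    // Swap roles for the next iteration.
    GLuint tmp = src; src = dst; dst = tmp;
}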
The best way to find out is to benchmark.

Fullscreen texture iPhone OpenGL ES

I'm aware that OpenGL textures on the iPhone are required to be a power of 2. Is this true of OpenGL ES 2.0 as well? If I have an image that is 320x480 in size and want to draw it fullscreen, is there any possible way to do this with OpenGL?
Thanks
NPOT textures are supported on PowerVR SGX hardware, but have restrictions: NPOT textures cannot use mipmaps, must be 2D (no cube maps or 3D textures), and must use GL_CLAMP_TO_EDGE for texture wrapping in both dimensions. This is supported by default in OpenGL ES 2.0, and under ES 1.1 by the extension GL_APPLE_texture_2D_limited_npot.
For ES 1.1, you can check at runtime to see if this extension is present with this code:
const char* extensions = (char*) glGetString(GL_EXTENSIONS);
bool npot = strstr(extensions, "GL_APPLE_texture_2D_limited_npot") != 0;
Since this is only present on the SGX and not the MBX, be aware that relying on NPOT texture support will limit you to the newer SGX devices. Of course, relying on ES 2.0 will do the same, so if that's your intended target, NPOT support is a moot point and you can go ahead with NPOT textures.
Here's an alternate solution that lets you keep using ES 1.1 and retain full device support. Put the 320x480 image inside a 512x512 texture, fill the whitespace with other background tiles, glyphs, or other textures that will be drawn at the same time (to avoid multiple glBindTexture calls), and then use one of my favourite ES 1.1 extensions, GL_OES_draw_texture, to quickly copy the 320x480 section onto the viewport:
GLint rect[4] = {0, 0, 480, 320};   // crop rectangle: x, y, width, height (landscape)
glBindTexture(GL_TEXTURE_2D, texBackground);
glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_CROP_RECT_OES, rect);
glDrawTexiOES(0, 0, z, 480, 320);   // z is the depth at which to draw
Sidebar: The OpenGL ES 2.0 spec itself doesn't specify any restrictions on NPOT textures; unless I'm mistaken, Apple is imposing the limitations. Of course, in the ES 1.1 world, NPOT support doesn't exist at all, so it's an addition there.
Assuming you don't have too many full-screen textures, you could just use a 512x512 texture and only use 320x480 of it. It will definitely work.
I guess that depends on the hardware. I used to create the closest power-of-2 texture, i.e. if my texture is 320x480, then I will create a 512x512 texture which will have the original texture data. This ensures portability but consumes a bit more memory ;)
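To make the padding approach concrete: with the 320x480 image uploaded into the bottom-left corner of a 512x512 texture, you sample only the used fraction of each axis (a sketch; pair these coordinates with a fullscreen quad drawn as a GL_TRIANGLE_STRIP):
const GLfloat texCoords[] = {
    0.0f,            0.0f,            // bottom-left
    320.0f / 512.0f, 0.0f,            // bottom-right
    0.0f,            480.0f / 512.0f, // top-left
    320.0f / 512.0f, 480.0f / 512.0f, // top-right
};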