iPhone OpenGL ES 2.0 blending with Cocos2D gives unexpected results

I have a very simple CCScene with only one CCLayer, containing:
a CCSprite for the background, with the standard blending mode
a CCRenderTexture to draw paint brushes into, with its sprite attached to the root CCLayer above the background sprite:
_bgSprite = [CCSprite spriteWithFile:backgroundPath];
_renderTexture = [CCRenderTexture renderTextureWithWidth:self.contentSize.width height:self.contentSize.height];
[_renderTexture.sprite setBlendFunc:(ccBlendFunc){GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA}];
[self addChild:_bgSprite z:-100];
[self addChild:_renderTexture];
Brush rendering code:
[_renderTexture begin];
glBlendFuncSeparate(GL_ONE, GL_ZERO, GL_ONE, GL_ONE); // 1.
// calculate vertices, etc...
glDrawArrays(GL_TRIANGLES, 0, (GLsizei)count);
[_renderTexture end];
When the user paints with the first colored brush, it blends with the background as expected.
But when the user continues brushing with another color on top of the previous stroke, it goes wrong (the soft alpha edges lose opacity where the two brushes overlap):
I have tried many blending options, but somehow I cannot find the correct one.
Is there something special about CCRenderTexture that keeps it from blending with its own previously drawn content as expected?
The fragment shader I use for brushing is just the standard texture shader, with a minor change to take the alpha from the input color rather than from the texture:
void main()
{
gl_FragColor = texture2D(u_texture, v_texCoord);
gl_FragColor.a = v_fragmentColor.a;
}
UPDATE - ALMOST PERFECT SOLUTION (by jozxyqk)
glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA,
GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
in the rendering code (in place of the line marked // 1. above), and
[_renderTexture.sprite setBlendFunc:(ccBlendFunc){GL_ONE, GL_ONE_MINUS_SRC_ALPHA}];
THIS WORKS GREAT AND GIVES ME WHAT I WANT...
...BUT ONLY WHEN _renderTexture is at full opacity.
When the opacity of _renderTexture.sprite is lowered, the brushes get lightened up instead of fading out as one would expect:
Why do the brush alphas blend with the background correctly when the parent texture is at full opacity, but go bananas when its opacity is lowered? How can I make the brushes blend with the background correctly?

EDIT
Blending brush -> layer -> background
OK, what's happening is that glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) is working for blending the brush strokes into the brush texture, but the resulting alpha values in the texture are wrong. Each added fragment needs to 1. add its alpha to the final alpha value - it has to remove exactly that much light for the interaction - and 2. scale the previous alpha by the remainder - previous surfaces reduce the light by the previous value, but since a new surface has been added there is less light for them to reduce. I'm not sure if that made sense, but it leads to this...
glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA,
GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
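To see what this pair of blend functions actually does, here is a tiny standalone C sketch (the pixel values are hypothetical; it just simulates the blend equation on the CPU) for two overlapping 50%-alpha brush fragments drawn into a cleared render texture:
#include <stdio.h>
/* CPU simulation of glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA,
   GL_ONE, GL_ONE_MINUS_SRC_ALPHA) for a single RGBA pixel. */
typedef struct { float r, g, b, a; } RGBA;
static RGBA blend(RGBA src, RGBA dst)
{
    RGBA out;
    out.r = src.r * src.a + dst.r * (1.0f - src.a); /* colour: SRC_ALPHA, ONE_MINUS_SRC_ALPHA */
    out.g = src.g * src.a + dst.g * (1.0f - src.a);
    out.b = src.b * src.a + dst.b * (1.0f - src.a);
    out.a = src.a * 1.0f  + dst.a * (1.0f - src.a); /* alpha:  ONE,       ONE_MINUS_SRC_ALPHA */
    return out;
}
int main(void)
{
    RGBA dst   = { 0.0f, 0.0f, 0.0f, 0.0f }; /* cleared render texture */
    RGBA red   = { 1.0f, 0.0f, 0.0f, 0.5f }; /* first brush fragment, 50% alpha */
    RGBA green = { 0.0f, 1.0f, 0.0f, 0.5f }; /* overlapping second brush fragment */
    dst = blend(red, dst);   /* -> (0.50, 0.00, 0.00, 0.50) */
    dst = blend(green, dst); /* -> (0.25, 0.50, 0.00, 0.75): alpha accumulates instead of fading */
    printf("%.2f %.2f %.2f %.2f\n", dst.r, dst.g, dst.b, dst.a);
    return 0;
}
Note how the stored colour ends up pre-multiplied by the accumulated alpha; that is what the rest of this answer relies on.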
Now the colour channel of the brush texture contains the total colour to be blended with the background (pre-multiplied with alpha) and the alpha channel gives the weight (or the amount the colour obscures the background). Since the colour is pre-multiplied with alpha, the default RenderTexture blending GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA scales with alpha again and hence darkens the overall colour. You now need to blend the brush texture with the background using the following function, which I gather must be set in Cocos2D:
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
Hopefully this is possible. I haven't given a lot of thought to how you might set up the brush texture so that it can be blended with GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, but it may require a floating point texture and/or an extra pass to divide/normalize the alpha, which sounds painful.
Alternatively, splat the background into your render texture before drawing and keep the lot there without any blending of layers.
This worked for me:
glDisable(GL_DEPTH_TEST);
glClear(GL_COLOR_BUFFER_BIT);
glEnable(GL_TEXTURE_2D);
glEnable(GL_BLEND);
fbo.bind();
glClear(GL_COLOR_BUFFER_BIT);
glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA,
GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
drawTexture(brush1);
drawTexture(brush2);
fbo.unbind();
drawTexture(grassTex); //tex alpha is 1.0, so blending doesn't affect background
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
drawTexture(fbo.getColour(0)); //blend in the brush layer
Brush layer opacity
Using GL_ONE, GL_ONE_MINUS_SRC_ALPHA causes issues with the library's implementation of opacity in layer blending, since that blend func assumes the colour is already multiplied by alpha. When the opacity value is reduced, the alpha of the brush layer is scaled down during blending; GL_ONE_MINUS_SRC_ALPHA then lets more of the background colour through, but GL_ONE still adds 100% of the brush layer's colour and oversaturates the image.
The simplest solution imo is to find a way to scale down the colour by the global layer opacity yourself and continue to use GL_ONE, GL_ONE_MINUS_SRC_ALPHA.
Actually using GL_CONSTANT_COLOR, GL_ONE_MINUS_SRC_ALPHA might be an answer if the library supported it, but apparently it doesn't.
You could use fixed pipeline rendering to scale the colour: glColor4f(opacity, opacity, opacity, opacity), but this will require a second render target and doing the blend manually, similarly to the code above, where you draw a full screen quad once for the background and again for the brush layer.
If you're doing the blend manually it would be more robust to use a fragment shader instead of the glColor method. This would allow far greater control if you ever wanted to play with more complex blending functions, especially where divisions and temporaries outside the 0 to 1 range are concerned:
gl_FragColor = texture2D(brushTexture, coord) * layerOpacity;
END EDIT
The standard alpha blending function is glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);, which is not quite the GL "initial"/default function (the default is GL_ONE, GL_ZERO).
Summing alpha values as you do in glBlendFuncSeparate will oversaturate alpha, and the colour underneath is completely replaced. Saturation blending may give decent results: glBlendFunc(GL_SRC_ALPHA_SATURATE, GL_ONE). It might also be worth experimenting with glBlendEquationSeparate and MAX blending, if it's supported. The advantage of playing with MAX would be reducing the overlapping artefacts (hard triangular bits) from your line drawing code - e.g. replace colour, but only until a total alpha value X is reached. EDIT: Both cases will require blending and clearing after each stroke.
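For reference, a minimal sketch of those two experiments (the MAX equation assumes the EXT_blend_minmax extension is present on the device, which is not guaranteed):
/* Saturation blending: the RGB source factor becomes min(As, 1 - Ad) */
glBlendFunc(GL_SRC_ALPHA_SATURATE, GL_ONE);
/* MAX blending (requires EXT_blend_minmax on ES 2.0); the blend factors
   are ignored while the equation is MIN/MAX */
glBlendEquation(GL_MAX_EXT);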
I can only assume blending the render texture onto the background is in fact working (just not for the current layer values).
On a side note, and largely unrelated, there's also "under blending", where you keep a transmittance value instead of alpha/opacity:
glBlendEquation(GL_FUNC_ADD);
glBlendFuncSeparate(GL_DST_ALPHA, GL_ONE, GL_ZERO, GL_ONE_MINUS_SRC_ALPHA);

Related

Texture transparency brush with OpenGL ES on iPhone

I am new to OpenGL and am trying to implement an eraser brush that erases a texture with brush strokes to reveal the background of the OpenGL view, by adjusting the alpha value along the brush stroke. The OpenGL view's opaque property is set to NO.
I am using Apple's GLPaint sample as a starting point. The brush I am using is a texture whose center has alpha zero, with the alpha radially fading to 1 at the circular edges.
I use glColorMask(0,0,0,1) to draw only on the alpha channel.
Now, the problem is with the blend function. If I use glBlendFunc(GL_DST_ALPHA, GL_SRC_ALPHA), it kind of works, but not as I expected.
I want the resulting alpha to be the minimum of the destination alpha (the alpha already on the screen) and the alpha of the brush.
The blending function glBlendFunc(GL_DST_ALPHA, GL_SRC_ALPHA) does not work in cases like...
Let's say the screen (destination) has an alpha of 0.5. When the edge of the brush (where the brush's alpha is 1) touches this pixel, the resulting pixel's alpha should remain 0.5 (the minimum of 1 and 0.5). But with the above blending function it becomes 1 * 0.5 + 0.5 * 1 = 1.0, making the pixel more opaque again.
What blending options shall I use to get a smooth erase brush? Is there any other approach I can take to solve it?
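(Not from the original thread, just a hedged sketch: if the EXT_blend_minmax extension happens to be available, a true minimum of the two alphas can be taken directly, which is exactly the behaviour described above. Shown with ES 2.0 entry points; on OpenGL ES 1.1 the equivalent call is glBlendEquationOES.)
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_TRUE); /* write only the alpha channel */
glBlendEquation(GL_MIN_EXT);  /* result alpha = min(brush alpha, destination alpha) */
/* ... draw the brush geometry ... */
glBlendEquation(GL_FUNC_ADD); /* restore the default blend equation */
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);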

OpenGL texture blending problems

I'm creating a 2D application for the iPad using OpenGL ES and am having some issues drawing transparent images.
I'm using PNG-24 images with full transparency. I'm also changing the color of some of these textures, which are white with some areas transparent or semi-transparent. That all works fine.
When I try to set the alpha value of one of these textures, however, it's not working quite right. The colors are much too saturated, and if the alpha value is 0, I'm left with a white rather than a transparent image over a light grey background. When such a transparent image is over a dark image, the dark image takes on a color similar to that of the transparent image.
I've tried many parameter combinations of glTexEnvi and glBlendFunc with no success.
I'm not very knowledgeable about OpenGL, so if anyone has any suggestions, that would be great. Let me know if there are any details that would help.
Thanks.
Here is the initialization of OpenGL
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
glClearColor(1.0f, 1.0f, 1.0f, 1.0f);
glDisable(GL_DEPTH_TEST);
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_COLOR_ARRAY);
glEnable(GL_TEXTURE_2D);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glEnable(GL_BLEND);
Sounds like you told OpenGL your texture has premultiplied alpha, but it actually doesn't.
What parameters are you using for glBlendFunc?
More explanation of pre-multiplied alpha
Thanks for everyone's input. I think I've found a solution.
The problem was that the textures had premultiplied alpha but the polygons the textures were being applied to did not. With glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA); and glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE); in place, all I had to do was manually multiply the R, G, and B color values of the polygons by the alpha value, so that everything is essentially using premultiplied alpha and renders consistently.
I'm not sure if this is the most efficient solution, but it seems to work perfectly so far and it was just a matter of adding a few lines of code to my image rendering class to update the color values whenever the opacity is changed.
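For what it's worth, the fix described above boils down to something like this hypothetical helper (the function name is made up), which keeps the vertex colour consistent with glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA):
/* Premultiply the polygon colour so it matches the premultiplied textures. */
static void setPremultipliedColor(GLfloat r, GLfloat g, GLfloat b, GLfloat a)
{
    glColor4f(r * a, g * a, b * a, a);
}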
wooo
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_ALPHA);
This is not valid; are you checking for GL errors? Check the glTexEnv man page to see the valid values:
http://www.opengl.org/sdk/docs/man/xhtml/glTexEnv.xml
GL_ALPHA is not a texture function, the valid texture functions are GL_ADD, GL_MODULATE, GL_DECAL, GL_BLEND, GL_REPLACE, and GL_COMBINE.
You might want to replace that GL_ALPHA with GL_MODULATE so you can control the texture alpha with the vertex color (glColor). This mode multiplies the texture with the vertex color.
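A minimal sketch of that suggestion (the 0.5 is just an example value):
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE); /* fragment = texture * vertex colour */
glColor4f(1.0f, 1.0f, 1.0f, 0.5f);                           /* fade the texture to 50% alpha */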
Chances are, if you're using a 24-bit PNG, you won't have an alpha channel. Try converting to a 32-bit PNG and editing the alpha channel. Alternatively, you can convert from 24-bit to 32-bit yourself and generate the alpha channel from the image data (e.g. by chroma-keying).
Then use glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); or glBlendFunc(GL_SRC_ALPHA, GL_ONE); if you want additive blending.

Translucent sprite in opengl es 2.0 using shader

I'm trying to create a shader that does the same thing as glColor4f, particularly the alpha part of it. In OpenGL ES 1.1, if you set the alpha to, say, 0.5, the sprite would be half translucent.
Now I can't seem to get that effect using a shader; this is what my shader looks like now:
gl_FragColor = texture2D(texture, coord) * blend;
And using this blend mode:
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
But that doesn't work; it changes the color of the sprite but not the translucency. What am I missing?
Thanks for your time,
Richard.
It seems that you are scaling the color you get from the texture by the blend factor, which is not how alpha blending works (this would just make it darker).
I believe you need something along the lines of the following:
gl_FragColor = vec4(texture2D(texture, coord).rgb, blend);
See if that works.

Change texture opacity in OpenGL

This is hopefully a simple question: I have an OpenGL texture and would like to be able to change its opacity, how do I do that? The texture already has an alpha channel and blending works fine, but I want to be able to decrease the opacity of the whole texture, to fade it into the background. I have fiddled with glBlendFunc, but with no luck – it seems that I would need something like GL_SRC_ALPHA_MINUS_CONSTANT, which is not available. I am working on iPhone, with OpenGL ES.
I have no idea about OpenGL ES, but in standard OpenGL you would set the opacity by declaring a colour for the texture before you use it:
// R, G, B, A
glColor4f(1.0, 1.0, 1.0, 0.5);
The example would give you 50% alpha without affecting the colour of your texture. By adjusting the other values you can shift the texture colour too.
Use a texture combiner. Set the texture stage to do a GL_MODULATE operation between a texture and constant color. Then change the constant color from your code (glTexEnv, GL_TEXTURE_ENV_COLOR).
This should come as "free" in terms of performance. On most (if not all) graphics chips combiner operations take the same number of GPU cycles (usually 1), so just using a texture versus doing a modulate operation (or any other operation) is exactly the same cost.
Basically you have two options: use glTexEnv for your texture with GL_MODULATE, and specify the color using glColor4* with a non-opaque level for the alpha channel. Note that glTexEnv should be issued only once, when you first load your texture. This scenario will not work if you specify colors in your vertex attributes, though, since those will override any glColor4* color you may set. In that case, you may want to resort to either of two options: use texture combiners (an advanced topic, and not nice to use in the fixed pipeline), or "manually" change the vertex color attribute of each individual vertex (which can be undesirable for larger meshes).
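For the combiner route, here is a sketch of what the setup might look like with OpenGL ES 1.1 tokens (untested here; the constant colour carries the opacity, so vertex colours stay free for other uses):
GLfloat envColor[4] = { 1.0f, 1.0f, 1.0f, 0.5f };         /* 50% opacity, change at will */
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_MODULATE);   /* RGB   = texture * constant */
glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_RGB, GL_TEXTURE);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC1_RGB, GL_CONSTANT);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_ALPHA, GL_MODULATE); /* alpha = texture * constant */
glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_ALPHA, GL_TEXTURE);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC1_ALPHA, GL_CONSTANT);
glTexEnvfv(GL_TEXTURE_ENV, GL_TEXTURE_ENV_COLOR, envColor);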
If you are using modern OpenGL...
You can do this in the fragment shader:
void main()
{
color = vec4(1.0f, 1.0f, 1.0f, OPACITY) * texture(u_Texture, TexCoord);
}
This allows you to apply an opacity value to a texture without disrupting the blending.
Thank you all for the ideas. I've played with both glColor4f and glTexEnv and at last forced myself to read the glTexEnv manpage carefully. The manpage says that in the GL_MODULATE texturing mode, the resulting color is computed by multiplying the incoming fragment color by the texture color (C = Cf × Ct), and the same goes for the alpha. I tried glColor4f(1, 1, 1, opacity) and that did not work, but passing the desired opacity into all four arguments of the call did the trick. (Still not sure why, though.)
The most straightforward way is to change the texture's alpha value on the fly. Since you tell OpenGL about the texture at some point, you will have the bitmap in memory, so just rebind the texture to the same texture id. In case you don't have it in memory (due to space constraints, since you are on ES), you can read the texture back into a buffer using glGetTexImage() (note that glGetTexImage is desktop-only; OpenGL ES does not provide it, so on the device you would have to keep your own copy of the data). That's the clean solution.
Saving/retrieving operations are a bit costly, though, so you might want another solution. Thinking about it, you might be able to work with geometry behind the geometry displaying your texture, or simply work on the material/colour of the geometry that holds the texture. You will probably want some additive blending of the back geometry. Using a glBlendFunc of
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_DST_ALPHA),
you might be able to "easily" - and, more importantly, cheaply - achieve the desired effect.
Most likely you are using Core Graphics (CG) to get your image into a texture. When you use CG, the alpha is premultiplied, which is why you have to pass the alpha in all four RGBA arguments of the glColor4f call.
I suspect that you had a black background, and thus by decreasing the amount of every color, you were effectively fading the color to black.
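In other words, with a premultiplied texture and GL_MODULATE, fading it cleanly means scaling all four channels by the same factor (opacity here is just a placeholder variable):
/* The RGB channels must be scaled along with alpha, otherwise the colour stays
   at full premultiplied intensity and only the coverage changes. */
glColor4f(opacity, opacity, opacity, opacity);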

Interpolating two textures by a third texture (factor is in lightness, not alpha)

How can I efficiently interpolate between two textures A and B per pixel, using a dynamic texture C as the factor, and draw the result on a simple quad? Multi-pass algorithms are accepted.
I've had moderate success calculating the C texture per-frame on the CPU and uploading it with glTexImage2D into an alpha-only texture.
While this worked, performance was lacking and I had to reduce the dimensions of C to half of the full size to get around the copying bandwidth bottleneck.
So, for performance reasons, I'm trying to do all of my C texture updates using render-to-texture.
I was able to set up the necessary buffers for rendering, but fundamentally, I get a texture in RGB or RGBA format with the mask encoded in lightness/RGB information, not alpha.
How do I convert this efficiently into the alpha texture I need to plug into the texturing pipeline? Keep in mind that there is no programmable pipeline (shaders) and only two texture units available on the iPhone.
Update:
A and B are RGB-only textures, i.e. no alpha.
Given that textures A and B are RGB images, perhaps you can make one of them into an RGBA image and render the mask into its alpha channel. This gets you within the iPhone's limit of two texture units, allowing you to do this in one pass, without blending.
GLSL pseudocode:
vec4 a = texture2D(textureA, texcoord);
vec4 b = texture2D(textureB, texcoord);
gl_FragColor = vec4(a.rgb * a.a + b.rgb * (1-a.a), dont_care.a);
Texture Environment for Unit 0: (samples RGB image B, and passes it on to the next stage)
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, textureB);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
Texture Environment for Unit 1: (image B is available as the 'Cp' and 'Ap' sources, A is available as 'Cs', and the mask is available as 'As'; see Table 3.15 in the GL spec).
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, textureA);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL);
Read section 3.7.12 Texture Environments and Texture Functions in the OpenGL ES 1.1 spec for full information.
To actually render your scene into the alpha channel of the image, it may be helpful to use glColorMask(), to allow writing to only the alpha channel. How you actually get the data into that channel really depends on exactly what you're drawing to generate that mask.
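As a sketch of that last point (assuming the mask is rendered into an RGBA framebuffer-object attachment that is later bound as texture A):
/* Write the dynamic mask C into only the alpha channel of the render target,
   leaving the RGB channels untouched. */
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_TRUE);
/* ... draw whatever geometry generates the mask here ... */
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE); /* restore the default mask */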