Texture transparency brush with OpenGL ES on iPhone

I am new to OpenGL and am trying to implement an eraser brush that erases a texture with brush strokes to reveal the background of the OpenGL view, by adjusting the alpha value along the brush stroke. The OpenGL view's opaque property is set to NO.
I am using Apple's GLPaint sample as a starting point. The brush I am using is a texture whose center has an alpha of zero, fading radially to an alpha of 1 at the circular edges.
I use glColorMask(0,0,0,1) to draw only on the alpha channel.
Now, the problem is with the blend function. If I use glBlendFunc(GL_DST_ALPHA, GL_SRC_ALPHA), it kind of works, but not as I expected.
I want the resulting alpha to be the minimum of the destination alpha (the alpha already on the screen) and the alpha of the brush.
The blending function glBlendFunc(GL_DST_ALPHA, GL_SRC_ALPHA) does not work in cases like...
Let's say the screen (destination) has an alpha of 0.5. When the edge of the brush (where the brush's alpha is 1) touches this pixel, the resulting pixel's alpha should remain 0.5 (the minimum of 1 and 0.5). But with the above blending function it becomes 1 * 0.5 + 0.5 * 1 = 1.0, making the pixel fully opaque again.
What blending options shall I use to get a smooth erase brush? Is there any other approach I can take to solve it?
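One possible direction (a sketch of my own, for OpenGL ES 2.0, and only if the device advertises the GL_EXT_blend_minmax extension): a MIN blend equation keeps the smaller of the source and destination values, which is exactly the minimum behaviour described above. Check glGetString(GL_EXTENSIONS) at runtime before relying on it.
#include <OpenGLES/ES2/glext.h>              // defines GL_MIN_EXT when the extension is present

glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_TRUE); // keep writing only to the alpha channel
glEnable(GL_BLEND);
glBlendEquation(GL_MIN_EXT);                 // result = min(src, dst); blend factors are ignored
// ... draw the brush quad ...
glBlendEquation(GL_FUNC_ADD);                // restore the default equation for normal drawing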

Related

Fade Object Inside Transparent Game Object

I have two cylinders in my scene. The outer cylinder (Cylinder A) has planes facing the outside and planes facing the inside to represent glass.
Cylinder A has a Standard Shader on it, set to Transparent mode, with an albedo alpha of 143. It also has a shininess to it, represented by its smoothness, which is set to 1.
Cylinder B has a Standard Shader on it, set to Transparent mode, with an albedo alpha that animates from 255 to 0 over a short period of time. It has a normal map set to 0.3 and no smoothness.
Here is an image to act as a guide.
The problem is that Cylinder B is not 'playing ball'. It should appear inside Cylinder A, and it would look right if Cylinder A's shininess were visible to the viewer.
I have provided a second example where Cylinder B is in Opaque mode (Standard Shader). It looks right - except it cannot animate a fade.
I have looked into this. From what I have experienced, Cutout mode will be either transparent or opaque, nothing in between. Fade mode has the same effect.
I can see that this has to do with Z ordering and that ZWrite may be the way to fix it. I am unfamiliar with its placement within shaders, or whether it will work while an animation fades out the alpha of Cylinder B, and I am wondering whether anyone can point me in the right direction for a further understanding.
Any help is greatly appreciated.

iOS Sprite Kit - SKSpriteNode - blend two sprites

Actually, I'm migrating a game from another platform, and I need to generate a sprite with two images.
The first image will be something like a form, pattern, or stamp, and the second is just a rectangle that gives color to the first. If the color were plain, it would be easy: I could use sprite.color and sprite.colorBlendFactor to play with it. But there are levels where the second image is a rectangle with two colors (red and green, for example).
Is there any way to implement these with Sprite Kit?
I mean something like using a Core Image filter, CIBlendWithAlphaMask, but with only an image and a mask image (https://developer.apple.com/library/ios/documentation/graphicsimaging/Reference/CoreImageFilterReference/Reference/reference.html#//apple_ref/doc/uid/TP40004346 -> CIBlendWithAlphaMask).
Thanks.
Look into the SKCropNode class (documentation here) - it allows you to set a mask for an image underneath it.
In short, you would create two SKSpriteNodes - one with your stamp, the other with your coloured rectangle. Then:
SKCropNode *myCropNode = [SKCropNode node];
[myCropNode addChild:colouredRectangle]; // the colour to be rendered by the form/pattern
myCropNode.maskNode = stampNode; // the pattern sprite node
[self addChild:myCropNode];
Note that the results will probably be more similar to CIBlendWithMask than to CIBlendWithAlphaMask, since the crop node masks out any pixel whose mask alpha is below 0.05 and fully renders every pixel above that level, so the edges will be jagged rather than smoothly faded. Just don't use any semi-transparent areas in your mask and you'll be fine.

iPhone OpenGL ES 2.0 blending with Cocos2D gives unexpected results

I have a very simple CCScene with only one CCLayer, containing:
a CCSprite for the background, using the standard blending mode
a CCRenderTexture to draw paint brushes into, with its sprite attached to the root CCLayer above the background sprite:
_bgSprite = [CCSprite spriteWithFile:backgroundPath];
_renderTexture = [CCRenderTexture renderTextureWithWidth:self.contentSize.width height:self.contentSize.height];
[_renderTexture.sprite setBlendFunc:(ccBlendFunc){GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA}];
[self addChild:_bgSprite z:-100];
[self addChild:_renderTexture];
Brush rendering code:
[_renderTexture begin];
glBlendFuncSeparate(GL_ONE, GL_ZERO, GL_ONE, GL_ONE); // 1.
// calculate vertices code,etc...
glDrawArrays(GL_TRIANGLES, 0, (GLsizei)count);
[_renderTexture end];
When the user paints with the first colored brush, it blends with the background as expected.
But when the user continues brushing with another color on top of the previous stroke, it goes wrong (the soft alpha edges lose opacity where two brushes overlap):
I tried many blending options, but somehow I cannot find the correct one.
Is there something special about CCRenderTexture such that it does not blend with its previously drawn content as expected?
My fragment shader used for brushing is just the standard texture shader with a minor change to preserve the input color's alpha:
varying vec4 v_fragmentColor;
varying vec2 v_texCoord;
uniform sampler2D u_texture;
void main()
{
    gl_FragColor = texture2D(u_texture, v_texCoord);
    gl_FragColor.a = v_fragmentColor.a; // keep only the input color's alpha
}
UPDATE - ALMOST PERFECT SOLUTION (by jozxyqk):
glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA,
GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
in the rendering code (in place of // 1. above), and
[_renderTexture.sprite setBlendFunc:(ccBlendFunc){GL_ONE, GL_ONE_MINUS_SRC_ALPHA}];
THIS WORKS GREAT AND GIVES ME WHAT I WANT...
...BUT ONLY WHEN _renderTexture is at full opacity.
When the opacity of _renderTexture.sprite is lowered, the brushes get lightened up instead of fading out as one would expect:
Why do the brushes' alphas blend with the background correctly when the parent texture is at full opacity, but go bananas when the opacity is lowered? How can I make the brushes blend with the background correctly?
EDIT
Blending brush -> layer -> background
OK, what's happening is that glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) is working for blending the brush strokes into the brush texture, but the resulting alpha values in the texture are wrong. Each added fragment needs to 1. add its alpha to the final alpha value, since it has to remove exactly that much light for the interaction, and 2. scale the previous alpha by the remainder: the previous surfaces still reduce the light by their previous amount, but since a new surface has been added there is less light left for them to reduce. I'm not sure if that made sense, but it leads to this...
glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA,
GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
Now the colour channel of the brush texture contains the total colour to be blended with the background (pre-multiplied with alpha) and the alpha channel gives the weight (or the amount the colour obscures the background). Since the colour is pre-multiplied with alpha, the default RenderTexture blending GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA scales with alpha again and hence darkens the overall colour. You now need to blend the brush texture with the background using the following function, which I gather must be set in Cocos2D:
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
Hopefully this is possible. I haven't given a lot of thought to how you might manage setting up the brush texture so it can be blended with GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA instead, but it may require a floating-point texture and/or an extra pass to divide/normalize the alpha, which sounds painful.
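To make that arithmetic concrete (a worked example of mine, not from the original post): painting two overlapping 50%-alpha fragments with the separate alpha blend above accumulates
alpha = 0.5 + 0.5 * (1 - 0.5) = 0.75
while the first fragment's premultiplied colour is scaled by the same (1 - 0.5) factor, so the texture ends up holding exactly the coverage and colour that the final GL_ONE, GL_ONE_MINUS_SRC_ALPHA composite over the background expects.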
Alternatively, splat the background into your render texture before drawing and keep the lot there without any blending of layers.
This worked for me:
glDisable(GL_DEPTH_TEST);
glClear(GL_COLOR_BUFFER_BIT);
glEnable(GL_TEXTURE_2D);
glEnable(GL_BLEND);
fbo.bind();
glClear(GL_COLOR_BUFFER_BIT);
glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA,
GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
drawTexture(brush1);
drawTexture(brush2);
fbo.unbind();
drawTexture(grassTex); //tex alpha is 1.0, so blending doesn't affect background
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
drawTexture(fbo.getColour(0)); //blend in the brush layer
Brush layer opacity
Using GL_ONE, GL_ONE_MINUS_SRC_ALPHA causes issues with the library's implementation of opacity in layer blending, since it assumes the colour is multiplied by alpha. By reducing the opacity value, the alpha of the brush layer is scaled down during blending. GL_ONE_MINUS_SRC_ALPHA then lets more of the background colour through; however, GL_ONE still sums in 100% of the brush layer and oversaturates the image.
The simplest solution imo is to find a way to scale down the colour by the global layer opacity yourself and continue to use GL_ONE, GL_ONE_MINUS_SRC_ALPHA.
Actually using GL_CONSTANT_COLOR, GL_ONE_MINUS_SRC_ALPHA might be an answer if the library supported it, but apparently it doesn't.
You could use fixed pipeline rendering to scale the colour: glColor4f(opacity, opacity, opacity, opacity), but this will require a second render target and doing the blend manually, similarly to the code above, where you draw a full screen quad once for the background and again for the brush layer.
If you're doing the blend manually it would be more robust to use a fragment shader instead of the glColor method. This would allow far greater control if you ever wanted to play with more complex blending functions, especially where divisions and temporaries outside the 0 to 1 range are concerned:
gl_FragColor = texture2D(brushTexture, coord) * layerOpacity;
END EDIT
The standard alpha blending function is glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA), which is not quite GL's "initial"/default function.
Summing alpha values as you do with glBlendFuncSeparate will oversaturate the alpha, and the colour underneath is completely replaced. Saturation blending may give decent results: glBlendFunc(GL_SRC_ALPHA_SATURATE, GL_ONE). It might also be worth experimenting with glBlendEquationSeparate and MAX blending, if it's supported; see the sketch below. The advantage of playing with MAX would be reducing the overlapping artefacts (hard triangular bits) from your line-drawing code - e.g. replace the colour, but only until a total alpha value X is reached. EDIT: Both cases will require blending and clearing after each stroke.
I can only assume blending the render texture onto the background is in fact working (though not for the current layer values).
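For the MAX experiment mentioned above, a minimal sketch (my own, assuming the EXT_blend_minmax extension is available; Cocos2D does not set this up for you) would be something like:
glBlendEquationSeparate(GL_FUNC_ADD, GL_MAX_EXT); // normal add for colour, MAX for alpha
glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, GL_ONE, GL_ONE); // alpha factors are ignored by MAX
// ... draw the stroke geometry into the render texture ...
glBlendEquationSeparate(GL_FUNC_ADD, GL_FUNC_ADD); // restore the default equations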
On a side note, and largely unrelated, there's also "under blending", where you keep a transmittance value instead of alpha/opacity (from here):
glBlendEquation(GL_FUNC_ADD);
glBlendFuncSeparate(GL_DST_ALPHA, GL_ONE, GL_ZERO, GL_ONE_MINUS_SRC_ALPHA);
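Reading those factors off (my interpretation, not from the linked article, and assuming the buffer's alpha is cleared to 1 beforehand): the destination alpha stores the remaining transmittance T rather than opacity, so each newly drawn surface, which in front-to-back order lies behind everything already composited, contributes
colour' = src_colour * T + dst_colour
T' = T * (1 - src_alpha)
i.e. its colour is weighted by how much light still gets through the nearer surfaces, and the transmittance is then reduced by the new surface's own opacity.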

Transparent Texture With OpenGL-ES 2.0

I am trying to add a transparent texture on top of a cube. Only the front face is not transparent; the other sides are transparent. What could be the problem? Any help is appreciated.
EDIT: I found that the face which is drawn first is opaque.
Three faces of the cube are drawn:
Opaque face (this face's indices are given first in glDrawElements).
Transparent faces.
You most probably ran into a sorting problem. To display transparent geometry correctly, the faces of the object have to be sorted from back to front.
Unfortunately there is no built-in support for that in OpenGL ES (or in any graphics library in existence). The only possibility is to sort your polygons, recreate your object each frame, and draw it with correctly ordered faces.
A workaround would be to use additive transparency instead of normal transparency. Additive blending is an order-independent calculation. You have to remember to turn off z-buffer writes while drawing, because otherwise some geometry may be occluded.
Additive transparency is achieved by setting both glBlendFunc factors to GL_ONE.
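A minimal sketch of that setup (my own example, not from the answer):
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE);   // additive blending: result is order independent
glDepthMask(GL_FALSE);         // turn off z-buffer writes for the transparent faces
// ... draw the transparent faces of the cube ...
glDepthMask(GL_TRUE);          // restore depth writes for opaque geometry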

Change texture opacity in OpenGL

This is hopefully a simple question: I have an OpenGL texture and would like to be able to change its opacity, how do I do that? The texture already has an alpha channel and blending works fine, but I want to be able to decrease the opacity of the whole texture, to fade it into the background. I have fiddled with glBlendFunc, but with no luck – it seems that I would need something like GL_SRC_ALPHA_MINUS_CONSTANT, which is not available. I am working on iPhone, with OpenGL ES.
I have no idea about OpenGL ES, but in standard OpenGL you would set the opacity by declaring a colour for the texture before you use it:
// R, G, B, A
glColor4f(1.0, 1.0, 1.0, 0.5);
The example would give you 50% alpha without affecting the colour of your texture. By adjusting the other values you can shift the texture colour too.
Use a texture combiner. Set the texture stage to do a GL_MODULATE operation between a texture and constant color. Then change the constant color from your code (glTexEnv, GL_TEXTURE_ENV_COLOR).
This should come "for free" in terms of performance. On most (if not all) graphics chips, combiner operations take the same number of GPU cycles (usually one), so just sampling a texture versus doing a modulate operation (or any other combiner operation) costs exactly the same.
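A sketch of that combiner setup under OpenGL ES 1.1 (my own example, not from the answer; the 50% constant alpha is just a placeholder):
static const GLfloat envColor[4] = { 1.0f, 1.0f, 1.0f, 0.5f }; // constant colour with 50% alpha
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_MODULATE);   // texture RGB * constant RGB
glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_RGB, GL_TEXTURE);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC1_RGB, GL_CONSTANT);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_ALPHA, GL_MODULATE); // texture alpha * constant alpha
glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_ALPHA, GL_TEXTURE);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC1_ALPHA, GL_CONSTANT);
glTexEnvfv(GL_TEXTURE_ENV, GL_TEXTURE_ENV_COLOR, envColor); // update this per frame to fade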
Basically you have two options: use glTexEnv for your texture with GL_MODULATE and specify the color using glColor4*, with a non-opaque level for the alpha channel. Note that glTexEnv should be issued only once, when you first load your texture. This scenario will not work if you specify colors in your vertex attributes, though; those will override any glColor4* color you may set. In that case, you may want to resort to either of two options: use texture combiners (an advanced topic, not nice to use in the fixed pipeline), or "manually" change the vertex color attribute of each individual vertex (which can be undesirable for larger meshes).
If you are using modern OpenGL...
You can do this in the fragment shader:
in vec2 TexCoord;
out vec4 color;
uniform sampler2D u_Texture;
uniform float OPACITY;
void main()
{
    color = vec4(1.0, 1.0, 1.0, OPACITY) * texture(u_Texture, TexCoord);
}
This allows you to apply an opacity value to a texture without disrupting the blending.
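With OPACITY declared as a uniform (as above), you would update it from application code; a hedged example, with the program handle and value assumed:
GLint opacityLoc = glGetUniformLocation(program, "OPACITY"); // 'program' is your linked shader program
glUseProgram(program);
glUniform1f(opacityLoc, 0.5f); // 50% opacity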
Thank you all for the ideas. I've played with both glColor4f and glTexEnv and at last forced myself to read the glTexEnv man page carefully. The man page says that in the GL_MODULATE texturing mode, the resulting color is computed by multiplying the incoming fragment color by the texture color (C = Cf x Ct), and the same goes for the alpha. I tried glColor4f(1, 1, 1, opacity) and that did not work, but passing the desired opacity into all four arguments of the call did the trick. (Still not sure why, though.)
The most straightforward way is to change the texture's alpha values on the fly. Since you tell OpenGL about the texture at some point, you will have the bitmap in memory, so just re-upload the modified bitmap to the same texture id (glTexImage2D or glTexSubImage2D). In case you don't have it in memory (due to space constraints, since you are on ES), you would have to read the texture back into a buffer first; note that glGetTexImage() is not available in OpenGL ES, so reading back means rendering the texture into a framebuffer and using glReadPixels(). That's the clean solution.
Saving/retrieving operations are a bit costly, though, so you might want another solution. Thinking about it, you might be able to work with geometry behind the geometry displaying your texture, or simply work on the material/colour of the geometry that holds the texture. You will probably want to have some additive blending of the back geometry. Using a glBlendFunc of
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_DST_ALPHA),
you might be able to "easily" and - more important, cheaply - achieve the desired effect.
Most likely you are using Core Graphics (CG) to get your image into a texture. When you use Core Graphics, the alpha is premultiplied, which is why you have to use the alpha value for all four RGBA arguments of the glColor4f call.
I suspect that you had a black background, and thus by decreasing the amount of every color, you were effectively fading the color to black.
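Putting those two observations together, a sketch for the fixed-function ES 1.1 path (my own example; it assumes the texture was loaded through Core Graphics and is therefore premultiplied):
GLfloat opacity = 0.5f;
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);                 // premultiplied-alpha blending
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
glColor4f(opacity, opacity, opacity, opacity);               // scale all four channels, as the OP found
// ... draw the textured quad ...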