iPhone OpenGL ES Texture2D Masking

What's the best way to mask a texture,
as in ColorSplash or other apps like iSteam, etc.?
I started learning OpenGL ES about 4 days ago (I'm a total
rookie) and tried the following approach:
1) I created a colored texture2D, a grayscale version of the first
texture, and a third texture2D called mask
2) I also created a texture2D for the brush, which is grayscale and
opaque (brush = black = 0,0,0,1 and surroundings = white =
1,1,1,1). My intention was to create an antialiased brush with smooth
edges, but I'm fine with a plain one right now
3) I searched for masking techniques on the internet and found this
tutorial about masking: ZeusCMD - Design and Development Tutorials : OpenGL ES Programming Tutorials - Masking.
The tutorial tells me to use blending to achieve
masking: first draw the colored texture, then the mask with
glBlendFunc(GL_DST_COLOR, GL_ZERO), and then the grayscale with
glBlendFunc(GL_ONE, GL_ONE) (see the sketch after this list). This gives me something close to
what I want, but not exactly: the result is masked, but
it's somehow overbrightened
4) For drawing to the mask texture I used an extra framebuffer object (FBO)
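For reference, the three-pass sequence from step 3 looks roughly like this. This is only a sketch; drawTexturedQuad() is a hypothetical helper that binds the given texture and draws a full-screen quad:

// Minimal sketch of the blend-based masking from the tutorial above.
glDisable(GL_BLEND);
drawTexturedQuad(coloredTexture);      // pass 1: the colored base

glEnable(GL_BLEND);
glBlendFunc(GL_DST_COLOR, GL_ZERO);    // pass 2: multiply by the mask
drawTexturedQuad(maskTexture);         // black zeroes pixels, white keeps them

glBlendFunc(GL_ONE, GL_ONE);           // pass 3: add the grayscale on top
drawTexturedQuad(grayscaleTexture);    // this additive pass is where the overbrightening comes from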
I'm not really happy with the resulting (overbrightened) image,
nor with the speed this method achieves. I think the normal way
would be to draw directly into the grayscale (overlay) texture2D, affecting
only its alpha channel in the places where the brush hits. Is there a
fast way to achieve this? I have searched a lot and never got an
answer that's clear and understandable. Then, in the main draw loop, I
could just draw the colored texture and blend the grayscale on top
with glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA).
I just want to learn to use OpenGL ES, and it's driving me nuts that I can't get it to work properly. Any advice or a link to a tutorial would be much appreciated.

For something that will actually work on the iPhone, try texture combiners.
I used them to mask an RGBA texture against another, transformed alpha texture.
That was for generating a complicated shadow in the absence of a stencil buffer,
but your situation doesn't seem so different.
Note that combiners are usually explained in terms of the equivalent fragment shaders, which works well.
Unfortunately, the combiners themselves are more complicated than their shader counterparts.
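As a rough sketch (not the exact setup from that shadow code), a two-unit combiner chain that keeps the RGB of the colored texture and multiplies in the alpha of a mask texture might look like this; coloredTexture and maskTexture are assumed handles:

// Unit 0 supplies the colored texture unchanged.
glActiveTexture(GL_TEXTURE0);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, coloredTexture);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);

// Unit 1 keeps unit 0's RGB but multiplies in the mask's alpha.
glActiveTexture(GL_TEXTURE1);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, maskTexture);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB,    GL_REPLACE);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_RGB,       GL_PREVIOUS);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB,   GL_SRC_COLOR);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_ALPHA,  GL_MODULATE);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_ALPHA,     GL_PREVIOUS);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_ALPHA, GL_SRC_ALPHA);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC1_ALPHA,     GL_TEXTURE);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND1_ALPHA, GL_SRC_ALPHA);
// (Unit 1 also needs its own texture coordinates, set up via
// glClientActiveTexture(GL_TEXTURE1) and a second texcoord array.)

// Then draw with glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)
// so the combined alpha masks the result.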

Related

OpenGL, primitives with opacity without visible overlap

I'm trying to draw semi-transparent primitives (lines, circles) in OpenGL ES using Cocos2d, but I can't avoid the visible regions where they overlap. Does anyone know how to solve this?
This is a problem you come across pretty often, even in 3D.
I'm not too familiar with Cocos2D, but one way to solve this in generic OpenGL is to fill the framebuffer with your desired alpha channel, switch the blending mode to glBlendFunc(GL_ONE_MINUS_DST_ALPHA, GL_DST_ALPHA), and draw the primitives. The idea is that each primitive is drawn with a transparency taken from the framebuffer, and in the process the area you've drawn to is masked, so subsequent overlapping primitives contribute nothing more there.
Another approach is to render the whole thing to a texture, or to assemble the shape from polygons that don't overlap.
I'm not sure whether Cocos2D supports any of these…
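A sketch of that recipe in plain GL; it assumes your context actually has a destination alpha channel (e.g. RGBA8), and drawPrimitives() is a hypothetical helper:

// Seed the framebuffer's alpha channel with the desired transparency.
glClearColor(0.0f, 0.0f, 0.0f, 0.3f);
glClear(GL_COLOR_BUFFER_BIT);

// Each pixel's source weight comes from what is still unmasked
// (1 - dst alpha); drawing raises dst alpha, so overlapping pixels
// contribute much less the second time around.
glEnable(GL_BLEND);
glBlendFunc(GL_ONE_MINUS_DST_ALPHA, GL_DST_ALPHA);
glColor4f(1.0f, 0.0f, 0.0f, 1.0f);   // full source alpha
drawPrimitives();                    // your lines/circles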
I do not know what capabilities Cocos2D specifically provides, but I can see two options:
One, do not overlap like that; instead, construct more complex geometry such that each pixel is covered only once.
Two, use the stencil buffer to build a mask as you draw, rejecting any pixels that are already masked (sketched below).
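A generic sketch of the stencil approach; it assumes the context was created with a stencil buffer, and drawPrimitives() is a hypothetical helper:

// Start with a cleared stencil buffer.
glEnable(GL_STENCIL_TEST);
glClearStencil(0);
glClear(GL_STENCIL_BUFFER_BIT);

// Pass only where the stencil is still 0, and bump each drawn pixel to 1,
// so a second (overlapping) primitive touching the same pixel is rejected.
glStencilFunc(GL_EQUAL, 0, 0xFF);
glStencilOp(GL_KEEP, GL_KEEP, GL_INCR);

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
drawPrimitives();                    // your semi-transparent lines/circles

glDisable(GL_STENCIL_TEST);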

OpenGL ES for iPhone blending not working

I'm a beginner to 3D graphics in general and I'm trying to make a 3D game for the iPhone, and more specifically, to use textures that contain transparency. I am able to load a texture (an 8 bit .png file) into OpenGL and map it to a square (made from a triangle strip) but the transparent parts of the image are not transparent when I run the app in the simulator - they take on the background colour, whatever it is set to, but obscure images that are further away. I am unable to post a screenshot as I am a new user, so my apologies for that. I will try to upload and link it some other way.
Even more annoying is that when I load the image into Apple's GLSprite example code, it works exactly as I want it to. I have copied the code from GLSprite's setupView into my project and it still doesn't work properly.
I am using the blend function:
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
I was under the impression that this is correct for what I want to do.
Is there something very basic I am missing here? Any help would be much appreciated as I am submitting this as a coursework project in a few weeks and would very much like it to work.
Let me break this down:
First, your transparent object is drawn.
At this point two things happen:
The pixels are drawn correctly to the back buffer.
The depth buffer is written all across your object; transparency does not affect the depth write.
You then draw other objects behind the transparent object.
But none of their pixels get drawn: the depth test rejects them, because their depth values mark them as farther away than what is already in the buffer.
The solution to this problem is to draw your scene back-to-front (start with the things furthest away), as sketched below.
Hope that helps.
Edit: I'm assuming you are using the depth buffer here. If this isn't correct I'll consider writing another answer.
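A sketch of that ordering, with the common extra step of disabling depth writes during the transparent pass; the helper functions here are hypothetical:

// 1. Draw opaque geometry first with normal depth testing and writing.
glEnable(GL_DEPTH_TEST);
glDepthMask(GL_TRUE);
drawOpaqueObjects();

// 2. Draw transparent geometry sorted far-to-near. Depth writes are
//    turned off so transparent surfaces don't block each other, while
//    the depth test still clips them against the opaque scene.
sortBackToFront(transparentObjects);
glDepthMask(GL_FALSE);
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);  // premultiplied alpha, as in the question
drawTransparentObjects();
glDepthMask(GL_TRUE);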

Eraser in OpenGL ES iphone

I have two images overlapping each other (the way cards are placed on top of each other).
Now, if I move my finger over the topmost image, that portion of the image should become transparent (the opacity of that part should become 0).
I am new to OpenGL ES development.
Kindly help me out or give me any suggestion on how to implement this functionality.
Thanks in advance
You're going to need render-to-texture using framebuffer objects (FBOs). Render into the desired texture, but only into its alpha channel, which is done using glColorMask (with it you can mask off all color channels except alpha); draw the brush pattern into the alpha channel, setting alpha to 0.0, then display the textures as normal.
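A rough sketch of that alpha-only erase pass, assuming the FBO around the top texture is already set up; textureFBO, screenFBO, and drawBrushAt() are hypothetical names:

// Erase by writing only to the alpha channel of the top texture.
glBindFramebufferOES(GL_FRAMEBUFFER_OES, textureFBO);
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_TRUE);     // alpha channel only
glDisable(GL_BLEND);
glDisable(GL_TEXTURE_2D);
glColor4f(0.0f, 0.0f, 0.0f, 0.0f);   // alpha = 0 where the finger touches
drawBrushAt(touchX, touchY);         // quad under the finger
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, screenFBO);    // back to the screen

// Then composite the two card textures as usual:
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);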
I just did something similar, and I found a solution using blending:
if (eraseMode) {
    // Replace the destination outright; drawing the brush with alpha = 0 erases.
    // (glBlendFunc takes enums, not floats: GL_ONE = 1, GL_ZERO = 0.)
    glBlendFunc(GL_ONE, GL_ZERO);
}
else {
    // Normal compositing with premultiplied alpha.
    glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
}
Some cosmetics are needed around this, but it's a simple solution that fits basic needs.

How to do this FadeOut Effect with openGL on iPhone?

I'm playing around with the GLPaint Example from Apple.
But I don't know how to create an effect which fades the already drawn stuff out.
I created an example in Flash which shows the effect I'm looking for:
http://staging.rwichmann.com/openglexample/
In Flash I'm drawing a texture onto a BitmapData, and every frame I apply a ColorTransform to the BitmapData, which fades out the previously drawn data.
I guess there must be a similar solution in OpenGL, something with the renderbuffer or framebuffer, but I haven't found one.
Do you have an idea, tip, or hint?
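For what it's worth, the closest OpenGL ES analog to that ColorTransform trick that I can think of is a hedged sketch only, not from the answer below: keep the strokes accumulating in a texture-backed FBO, and each frame blend a faint quad over it before drawing the new strokes. paintFBO and drawFullscreenQuad() are assumed helpers, not part of GLPaint itself:

// Each frame, before drawing new strokes: fade the accumulated drawing
// slightly toward the background, like the Flash ColorTransform.
glBindFramebufferOES(GL_FRAMEBUFFER_OES, paintFBO);
glDisable(GL_TEXTURE_2D);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glColor4f(0.0f, 0.0f, 0.0f, 0.05f);   // 5% fade per frame toward black
drawFullscreenQuad();                  // untextured quad over the whole FBO
// ...now draw this frame's new strokes into the same FBO as usual.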
Just a suggestion, and it's more art-related than code-related: create a long trailing piece of art with diminishing alpha values, so that by the end of the trail the graphic is fully transparent. Rotate and scale it as necessary to match the turns and direction of the lead object. This may be oversimplified, but it will also work with Quartz. If you go 3D, you have to consider the rendering direction relative to the camera and apply a similar series of fading alpha textures.
Sorry, no real code to show you.
Best regards,
Natchaphon

antialiasing iPhone OpenGLES

I need antialiasing on the iPhone 3G (OpenGL ES 1.1), NOT the iPhone 3GS with OpenGL ES 2.0.
I've drawn a 3D model, and the pixels on the edges of the model look like teeth.
I've tried setting various texture filters, but those filters only make the INSIDE of the texture look better.
How can I get good antialiasing?
Maybe I should use some kind of smoothing when drawing the triangles? If so, how is that possible in OpenGL ES 1.1?
Thanks.
As of iOS 4.0, full-screen anti-aliasing is directly supported via an Apple extension to OpenGL. The basic concept is similar to epatel's suggestion: render the scene onto a larger framebuffer, then copy that down to a screen-sized framebuffer, then copy that buffer to the screen. The difference is, instead of creating a texture and rendering it onto a quad, the copy/sample operation is performed by a single function call (specifically, glResolveMultisampleFramebufferAPPLE()).
For details on how to set up the buffers and modify your drawing code, you can read a tutorial on the Gando Games blog which is written for OpenGL ES 1.1; there is also a note on Apple's Developer Forums explaining the same thing.
Thanks to Bersaelor for pointing this out in another SO question.
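For orientation, a condensed sketch of that setup using the OES/APPLE entry points from ES 1.1; screenFramebuffer is assumed to be your existing on-screen FBO, and error checking plus depth-buffer setup are omitted:

// Multisample FBO via GL_APPLE_framebuffer_multisample.
GLuint msaaFramebuffer, msaaRenderbuffer;
glGenFramebuffersOES(1, &msaaFramebuffer);
glGenRenderbuffersOES(1, &msaaRenderbuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, msaaFramebuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, msaaRenderbuffer);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER_OES, 4,      // 4x MSAA
                                      GL_RGBA8_OES, width, height);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES,
                             GL_RENDERBUFFER_OES, msaaRenderbuffer);

// Each frame: render the scene into msaaFramebuffer, then resolve it
// down to the ordinary on-screen framebuffer in one call.
glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, msaaFramebuffer);
glBindFramebufferOES(GL_DRAW_FRAMEBUFFER_APPLE, screenFramebuffer);
glResolveMultisampleFramebufferAPPLE();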
You can render into a larger FBO and then use that as a texture on a square.
Have a look at this article for an explanation.
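In outline, that looks something like this; bigFBO, its color texture, and the draw helpers are assumed to exist already:

// Render at 2x into a texture-backed FBO, then draw that texture on a
// screen-sized quad; linear filtering while scaling down does the smoothing.
glBindFramebufferOES(GL_FRAMEBUFFER_OES, bigFBO);       // 2x-sized FBO
glViewport(0, 0, width * 2, height * 2);
drawScene();

glBindFramebufferOES(GL_FRAMEBUFFER_OES, screenFBO);    // back to the screen
glViewport(0, 0, width, height);
glBindTexture(GL_TEXTURE_2D, bigFBOTexture);            // the FBO's color texture
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
drawFullscreenTexturedQuad();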
Check out the EGL_SAMPLE_BUFFERS and EGL_SAMPLES parameters to eglChooseConfig(), as well as glEnable(GL_MULTISAMPLE).
EDIT: Hrm, apparently you're out of luck, at least as far as standardized approaches go. As mentioned in that thread you can render to a large off-screen texture and scale to a smaller on-screen quad or jitter the view matrix several times.
We found another way to achieve this. If you edit your textures and add, for example, a 2-pixel frame of transparent pixels, the colored pixels in the texture get blended with the transparent ones where necessary, giving a basic anti-aliasing effect. You can read the full article here on our blog.
The advantage of this approach is that you are not rendering a bigger image, copying a buffer, or, even worse, making a texture from a buffer, so there is no impact on performance.