How to use multiple textures in OpenGL ES 2.0 - iPhone

I would like to set up a texture array to use for my shape. I have researched this topic on the internet but there are hardly any references a newbie like me can make use of.
Again, what I am trying to achieve is a texture array that I can use to map different textures onto the different faces of my shape.
As of now I have got only one texture that I generate from a UIView.
My core questions are:
How do I set up this array ?
How do I load textures into that array ?
How do I use this array ?
Here is my code:
- (void)setupGL {
    [EAGLContext setCurrentContext:self.myContext];
    self.effect = [[GLKBaseEffect alloc] init];
    self.layer.contentsScale = 2.0;
    BOOL useTexture = YES;

    // Create default framebuffer object.
    glGenFramebuffers(1, &defaultFrameBuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, defaultFrameBuffer);

    myView = [[MyView alloc] initWithFrame:CGRectMake(0, 0, 320, 320)];

    self.effect.transform.projectionMatrix = GLKMatrix4MakePerspective(45.0f, 0.9f, 0.01f, .08f);
    self.effect.transform.projectionMatrix = GLKMatrix4Translate(self.effect.transform.projectionMatrix, 0, 0.1, 1.2);
    rotMatrix = GLKMatrix4Translate(self.effect.transform.modelviewMatrix, 0, 0, -2);
    self.effect.transform.modelviewMatrix = rotMatrix;

    /*********************
     MAPPING UIVIEW ONTO THE FACE
     ****************/
    CGColorSpaceRef colourSpace = CGColorSpaceCreateDeviceRGB();
    self.effect.texture2d0.enabled = true;

    GLubyte *pixelBuffer = (GLubyte *)malloc(4 *
                                             myView.bounds.size.width * coordToPixScale *
                                             myView.bounds.size.height * coordToPixScale);
    CGContextRef context = CGBitmapContextCreate(pixelBuffer,
                                                 myView.bounds.size.width * coordToPixScale,
                                                 myView.bounds.size.height * coordToPixScale,
                                                 8, 4 * myView.bounds.size.width * coordToPixScale,
                                                 colourSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colourSpace);

    // draw the view to the buffer
    [myView.layer renderInContext:context];

    // upload to OpenGL
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
                 myView.bounds.size.width * coordToPixScale, myView.bounds.size.height * coordToPixScale, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixelBuffer);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    // clean up
    CGContextRelease(context);

    glGenBuffers(1, &texArray);
    glBindBuffer(GL_ARRAY_BUFFER, texArray);
    glEnableVertexAttribArray(GLKVertexAttribTexCoord0);
    glBufferData(GL_ARRAY_BUFFER, sizeof(TexCoords), TexCoords, GL_STATIC_DRAW);
    glVertexAttribPointer(GLKVertexAttribTexCoord0, 2, GL_FLOAT, GL_FALSE, 0, 0);
    /**************************
     ******************************************/
    free(pixelBuffer);

    glGenRenderbuffers(1, &depthBuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, depthBuffer);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, myView.bounds.size.width * coordToPixScale, myView.bounds.size.height * coordToPixScale);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthBuffer);
    glEnable(GL_DEPTH_TEST);

    glGenBuffers(1, &vertexArray);
    glBindBuffer(GL_ARRAY_BUFFER, vertexArray);
    glEnableVertexAttribArray(GLKVertexAttribPosition);
    glBufferData(GL_ARRAY_BUFFER, sizeof(Vertices), Vertices, GL_STATIC_DRAW);
    glVertexAttribPointer(GLKVertexAttribPosition, 3, GL_FLOAT, GL_FALSE, 0, 0);
}
This method is called when it gets drawn:
- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect {
    self.opaque = NO;
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    [self.effect prepareToDraw];
    glDrawArrays(GL_TRIANGLES, 0, sizeof(Vertices) / (sizeof(GLfloat) * 3));
}

Seeing as you are using GLKit, I think you can load the textures more easily. First, create your CGImageRef:
CGImageRef image0 = [[UIImage imageNamed:@"image0.png"] CGImage];
Load that into a GLKTextureInfo ivar, textureInfo0, as follows:
self.textureInfo0 = [GLKTextureLoader
                     textureWithCGImage:image0
                     options:[NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithBool:YES],
                              GLKTextureLoaderOriginBottomLeft, nil]
                     error:NULL];
Do exactly the same with your second texture. That should be named textureInfo1.
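Loading the second texture could then look like this (a sketch; the file name image1.png is just a placeholder for your second image):
CGImageRef image1 = [[UIImage imageNamed:@"image1.png"] CGImage];   // placeholder file name
self.textureInfo1 = [GLKTextureLoader
                     textureWithCGImage:image1
                     options:[NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithBool:YES],
                              GLKTextureLoaderOriginBottomLeft, nil]
                     error:NULL];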
Loading the texture coordinates is done with glEnableVertexAttribArray and glVertexAttribPointer using the GLKVertexAttribTexCoord0 attribute; I think you have that set up above. You will need to map the coordinates from your texture to the target triangles. For example, if you wanted to draw a square, which is composed of two triangles, you would need to map the S and T coordinates of the source image twice, once for each triangle. For example, triangle 1 with coords:
{0.0f, 0.0f}, {1.0f, 0.0f}, {0.0f, 1.0f}.
(if you plot that you will see it is a triangle)
Then, the second triangle of the square would be as follows:
{1.0f, 0.0f}, {0.0f, 1.0f}, {1.0f, 1.0f}.
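Put together, the texture coordinates for that square could live in an array like the TexCoords buffer from your setup code. A rough sketch (the ordering is an assumption and must match the order of your Vertices array):
// Texture coordinates for one square face, i.e. two triangles / six vertices.
// The order must line up with the corresponding positions in Vertices.
GLfloat TexCoords[] = {
    0.0f, 0.0f,   1.0f, 0.0f,   0.0f, 1.0f,   // triangle 1
    1.0f, 0.0f,   0.0f, 1.0f,   1.0f, 1.0f    // triangle 2
};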
Next, in your drawing method:
- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect
first, set up the initial texture:
self.effect.texture2d0.name = self.textureInfo0.name;
self.effect.texture2d0.target = self.textureInfo0.target;
[self.effect prepareToDraw];
Draw your triangles as needed with glDrawArrays. Then switch to the second texture:
self.effect.texture2d0.name = self.textureInfo1.name;
self.effect.texture2d0.target = self.textureInfo1.target;
[self.effect prepareToDraw];
and draw the rest of the triangles with glDrawArrays. They will use the second texture.
Continue replacing self.effect.texture2d0.name and self.effect.texture2d0.target with different GLKTextureInfo instances to draw the different textures.
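Putting the pieces together, the drawing method might look roughly like this. This is only a sketch: the vertex counts 30 and 6 are placeholders, so adjust them to however many vertices of Vertices belong to each texture.
- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect {
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // Faces that use the first texture.
    self.effect.texture2d0.name = self.textureInfo0.name;
    self.effect.texture2d0.target = self.textureInfo0.target;
    [self.effect prepareToDraw];
    glDrawArrays(GL_TRIANGLES, 0, 30);    // placeholder count

    // Faces that use the second texture.
    self.effect.texture2d0.name = self.textureInfo1.name;
    self.effect.texture2d0.target = self.textureInfo1.target;
    [self.effect prepareToDraw];
    glDrawArrays(GL_TRIANGLES, 30, 6);    // placeholder start/count
}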

Related

iPhone OpenGL ES Paint App Brush Effect

I'm developing a painting app for iPhone and iPad, taking the GLPaint sample app as a reference.
I am working on the brush effect. I want my paint app to produce a brush effect as shown in image 1, but so far I've only got a brush stroke similar to image 2.
I am using the following code for the brush texture:
CGImageRef brushImage;
CGContextRef brushContext;
GLubyte *brushData;
size_t width, height;

if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad) {
    brushImage = [UIImage imageNamed:@"flower@2x.png"].CGImage;
}
else {
    brushImage = [UIImage imageNamed:@"flower.png"].CGImage;
}

width = CGImageGetWidth(brushImage);
height = CGImageGetHeight(brushImage);

if (brushImage) {
    brushData = (GLubyte *) calloc(width * height * 4, sizeof(GLubyte));
    brushContext = CGBitmapContextCreate(brushData, width, height, 8, width * 4, CGImageGetColorSpace(brushImage), kCGImageAlphaPremultipliedLast);
    CGContextDrawImage(brushContext, CGRectMake(0.0, 0.0, (CGFloat)width, (CGFloat)height), brushImage);
    CGContextRelease(brushContext);

    glGenTextures(1, &brushTexture);
    glBindTexture(GL_TEXTURE_2D, brushTexture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, brushData);
    free(brushData);
}
CGFloat scale;
scale = self.contentScaleFactor;
glMatrixMode(GL_PROJECTION);
CGRect frame = self.bounds;
glLoadIdentity();
glOrthof(0, (frame.size.width) * scale, 0, (frame.size.height) * scale, -1, 1);
glViewport(0, 0, (frame.size.width) * scale, (frame.size.height) * scale);
glMatrixMode(GL_MODELVIEW);
glDisable(GL_DITHER);
glEnable(GL_BLEND);
glEnable(GL_TEXTURE_2D);
glEnableClientState(GL_VERTEX_ARRAY);
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_POINT_SPRITE_OES);
glTexEnvf(GL_POINT_SPRITE_OES, GL_COORD_REPLACE_OES, GL_TRUE);
glPointSize(width / kBrushScale);
// Define a starting color
HSL2RGB((CGFloat) 0.0 / (CGFloat)kPaletteSize, kSaturation, kLuminosity, &components[0], &components[1], &components[2]);
glColor4f(components[0] * kBrushOpacity, components[1] * kBrushOpacity, components[2] * kBrushOpacity, kBrushOpacity);
I've been searching for code related to different paint brush strokes but cannot find any. Please help me get the desired brush stroke, similar to image 1.
Do not use GL_POINT_SPRITE_OES mode; draw the sprites as ordinary textured triangles instead. Then you can bind the sprite texture coordinates to the target output coordinates and make the sprite texture repeatable.
Assume the sprite texture size is 32x32. The default texture coordinates are the rect {{0, 0}, {1.0, 1.0}}. To draw a sprite at position {x, y}, use texture coordinates based on the rect {{(x % 32) / 32.0, (y % 32) / 32.0}, {1.0, 1.0}}. This way the sprite content does not smudge.
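A minimal sketch of that approach, assuming a 32x32 brush texture created with GL_REPEAT wrapping and the same OpenGL ES 1.1 client-state calls used in the question (the function name and layout are mine, not from GLPaint):
#include <math.h>   // for fmodf

// Draw one 32x32 brush stamp at (x, y) as a textured triangle strip.
// The brush texture must be set up with GL_REPEAT wrapping so that
// texture coordinates greater than 1.0 wrap around instead of clamping.
static void DrawBrushStamp(GLfloat x, GLfloat y)
{
    const GLfloat size = 32.0f;
    GLfloat s = fmodf(x, size) / size;   // texture offset derived from the screen position
    GLfloat t = fmodf(y, size) / size;

    GLfloat quad[8] = {
        x,        y,
        x + size, y,
        x,        y + size,
        x + size, y + size
    };
    GLfloat texCoords[8] = {
        s,        t,
        s + 1.0f, t,
        s,        t + 1.0f,
        s + 1.0f, t + 1.0f
    };

    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, quad);
    glTexCoordPointer(2, GL_FLOAT, 0, texCoords);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
}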

OpenGL ES on iPhone doesn't draw anything

I'm an absolute beginner in OpenGL ES programming in iOS.
This is my first attempt to draw some simple 2D primitives with OpenGL ES onto a view.
Here is the class declaration:
@interface OGLGameCanvas : UIView <GameCanvas> {
    EAGLContext* context;
    Game* game;
    GLuint framebuffer, renderbuffer, depthbuffer;
}
Here is my initialization code:
- (void)initialize {
    // Get the layer and set properties
    CAEAGLLayer* layer = (CAEAGLLayer*)self.layer;
    layer.opaque = NO;
    layer.drawableProperties = [NSDictionary dictionaryWithObjectsAndKeys:
                                [NSNumber numberWithBool:NO], kEAGLDrawablePropertyRetainedBacking,
                                kEAGLColorFormatRGBA8, kEAGLDrawablePropertyColorFormat, nil];

    // Set the context
    context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];
    if (!context || ![EAGLContext setCurrentContext:context])
        DLog(@"Cannot create EAGLContext!");

    // Create the color buffer and the render buffer
    glGenFramebuffers(1, &framebuffer);
    glGenRenderbuffers(1, &renderbuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, renderbuffer);
    [context renderbufferStorage:GL_RENDERBUFFER fromDrawable:(CAEAGLLayer*)layer];
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, renderbuffer);
    GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
    if (status != GL_FRAMEBUFFER_COMPLETE)
        NSLog(@"Failed to make complete frame buffer: %x", status);

    // Get width and height of the render buffer
    GLint width, height;
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &width);
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &height);

    // Create and start animation loop
    CADisplayLink* displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(drawFrame:)];
    [displayLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
}
and my drawing code:
- (void)drawFrame:(CADisplayLink*)sender {
    glLoadIdentity();
    glClearColor(0.0, 0.0, 0.0, 1.0);
    glClear(GL_COLOR_BUFFER_BIT);

    glEnableClientState(GL_VERTEX_ARRAY);
    GLfloat vertices[] = { -20.0f, 2.5f, 0.0f, 0.0f, 1.5f, 7.5f };
    glVertexPointer(2, GL_FLOAT, 0, vertices);
    glColor4f(1.0, 0.0, 0.0, 1.0);
    glPointSize(5.0);
    glDrawArrays(GL_POINTS, 0, 3);
    glDisableClientState(GL_VERTEX_ARRAY);

    [context presentRenderbuffer:GL_RENDERBUFFER];
}
The canvas gets cleared (in fact, it becomes black, or red, or whatever I set in glClearColor), but no points are drawn.
I'm pretty sure I'm forgetting something basic and essential.
Thanks for your help.
The problem is here:
GLfloat vertices[] = { -20.0f, 2.5f, 0.0f, 0.0f, 1.5f, 7.5f };
The normalized device coordinates lie in the range [-1, 1], so you are drawing your points outside the visible area.
You also have to set up the matrices before drawing:
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
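One way to make those points visible is to give the projection a range that actually contains them. A minimal sketch, with arbitrary ortho bounds chosen only to cover the example vertices (pick bounds that match your own scene):
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrthof(-25.0f, 25.0f, -10.0f, 10.0f, -1.0f, 1.0f);   // arbitrary bounds covering the example points
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();

// The points from the question now fall inside the visible region.
GLfloat vertices[] = { -20.0f, 2.5f, 0.0f, 0.0f, 1.5f, 7.5f };
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(2, GL_FLOAT, 0, vertices);
glColor4f(1.0f, 0.0f, 0.0f, 1.0f);
glPointSize(5.0f);
glDrawArrays(GL_POINTS, 0, 3);
glDisableClientState(GL_VERTEX_ARRAY);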

OpenGL ES iPhone glCopyTexImage2D

I am new to OpenGL ES on iPhone and have a memory issue with glCopyTexImage2D. As far as I understand, this function should copy the current framebuffer into the bound texture, but for some reason it always allocates new memory, which I can see in Instruments when checking the allocations.
My goal is to read texture images and draw on them; after drawing I want to save the new texture so I can scroll through the painting. Here is my code:
1) Init OpenGL and the framebuffer:
context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];
if (!context || ![EAGLContext setCurrentContext:context]) {
    [self release];
    return nil;
}
glEnable(GL_TEXTURE_2D);
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_SRC_COLOR);
// Setup OpenGL states
glMatrixMode(GL_PROJECTION);
CGRect frame = self.bounds;
CGFloat scale = self.contentScaleFactor;
// Setup the view port in Pixels
glOrthof(0, frame.size.width * scale, 0, frame.size.height * scale, -1, 1);
glViewport(0, 0, frame.size.width, frame.size.height * scale);
glDisable(GL_DEPTH_TEST);
glDisable(GL_DITHER);
glMatrixMode(GL_MODELVIEW);
glEnableClientState(GL_VERTEX_ARRAY);
// Set a blending function appropriate for premultiplied alpha pixel data
glEnable(GL_POINT_SPRITE_OES);
glTexEnvf(GL_POINT_SPRITE_OES, GL_COORD_REPLACE_OES, GL_TRUE);
glPointSize(64 / kBrushScale);
2) Now I load the saved images into textures:
if ([[NSFileManager defaultManager] fileExistsAtPath:path]) {
    // load texture
    NSData* data = [[NSData alloc] initWithContentsOfFile:path];
    glGenTextures(1, &drawBoardTextures[i]);
    glBindTexture(GL_TEXTURE_2D, drawBoardTextures[i]);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1024, 1024, 0, GL_RGBA, GL_UNSIGNED_BYTE, [data bytes]);
    // free memory
    [data release];
}
3) And finally, render the textures:
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glColor4f(1.0f, 1.0f, 1.0f, 1.0f);

int width = 1024;
GLfloat quad[] = {0.0, 1024.0, 1024.0, 1024.0, 0.0, 0.0, 1024.0, 0.0};
GLfloat quadTex[] = {0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0};

for (int i = 0; i < 10; i++) {
    quad[0] = width * i;
    quad[2] = quad[0] + width;
    quad[4] = quad[0];
    quad[6] = quad[2];

    glBindTexture(GL_TEXTURE_2D, drawBoardTextures[i]);
    glVertexPointer(2, GL_FLOAT, 0, quad);
    glTexCoordPointer(2, GL_FLOAT, 0, quadTex);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    glBindTexture(GL_TEXTURE_2D, 0);
}
4) So far everything works fine; with glTranslatef I can scroll through the textures, and no allocation is observed in Instruments yet. Now I draw on the current window and want to save the result as follows:
int texIndex = offset.x/1024;
float diff = offset.x - (1024*texIndex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glBindTexture(GL_TEXTURE_2D, drawBoardTextures[texIndex]);
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, diff, 0, 0, 0, 1024-diff, 1024);
glBindTexture(GL_TEXTURE_2D, drawBoardTextures[texIndex + 1]);
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 1024-diff, 0, diff, 1024);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glFlush();
Now the problems start. Instead of writing directly into the generated textures, it writes into client memory: every copied texture uses about 4 MB of RAM, although re-copying the same texture doesn't need any additional memory. I really don't know what I did wrong.
Does anyone know what the problem is? Thanks a lot for your help.
cheers
chris

Draw a line on top of triangles

I created a new iPhone OpenGL Project in Xcode. I filled my background with triangles and gave them a texture, see below:
CGImageRef spriteImage;
CGContextRef spriteContext;
GLubyte *spriteData;
size_t width, height;
// Sets up matrices and transforms for OpenGL ES
glViewport(0, 0, backingWidth, backingHeight);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
//glRotatef(-90,0,0,1);
glOrthof(-1.0f, 1.0f, -1.5f, 1.5f, -1.0f, 1.0f);
glMatrixMode(GL_MODELVIEW);
// Clears the view with black
glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
// Sets up pointers and enables states needed for using vertex arrays and textures
glVertexPointer(2, GL_FLOAT, 0, vertices);
glEnableClientState(GL_VERTEX_ARRAY);
//glColorPointer(4, GL_FLOAT, 0, triangleColors);
//glColor4f(0.0f,1.0f,0.0f,1.0f);
//glEnableClientState(GL_COLOR_ARRAY);
glTexCoordPointer(2, GL_FLOAT, 0, spriteTexcoords);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
// Creates a Core Graphics image from an image file
spriteImage = [UIImage imageNamed:@"Bild.png"].CGImage;
// Get the width and height of the image
width = CGImageGetWidth(spriteImage);
height = CGImageGetHeight(spriteImage);
// Texture dimensions must be a power of 2. If you write an application that allows users to supply an image,
// you'll want to add code that checks the dimensions and takes appropriate action if they are not a power of 2.
if (spriteImage) {
    // Allocate memory needed for the bitmap context
    spriteData = (GLubyte *) calloc(width * height * 4, sizeof(GLubyte));
    // Uses the bitmap creation function provided by the Core Graphics framework.
    spriteContext = CGBitmapContextCreate(spriteData, width, height, 8, width * 4, CGImageGetColorSpace(spriteImage), kCGImageAlphaPremultipliedLast);
    // After you create the context, you can draw the sprite image to the context.
    CGContextDrawImage(spriteContext, CGRectMake(0.0, 0.0, (CGFloat)width, (CGFloat)height), spriteImage);
    // You don't need the context at this point, so you need to release it to avoid memory leaks.
    CGContextRelease(spriteContext);
    // Use OpenGL ES to generate a name for the texture.
    glGenTextures(1, &spriteTexture);
    // Bind the texture name.
    glBindTexture(GL_TEXTURE_2D, spriteTexture);
    // Set the texture parameters to use a minifying filter and a linear filter (weighted average)
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    // Specify a 2D texture image, providing a pointer to the image data in memory
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, spriteData);
    // Release the image data
    free(spriteData);
    // Enable use of the texture
    glEnable(GL_TEXTURE_2D);
    // Set a blending function to use
    glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
    // Enable blending
    glEnable(GL_BLEND);
I have two questions, because I am not so familiar with OpenGL.
I want to write a method that takes two points as parameters and draws a line between these two points on top of my triangles (the background).
- (void) drawLineFromPoint1:(CGPoint)point1 toPoint2:(CGPoint)point2 {
    GLfloat triangle[] = { // Just example points
        0.0f, 0.0f,
        0.1f, 0.0f,
        0.1f, 0.0f,
        0.1f, 0.1f
    };
    GLfloat triangleColors[] = {
        0.5f, 0.5f, 0.5f, 1.0f
    };
    //now draw the triangle
}
Something like that. Then I want a second method that erases this line (and not the background).
My drawing method looks like this:
- (void)drawView
{
    // Make sure that you are drawing to the current context
    [EAGLContext setCurrentContext:context];
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
    glClear(GL_COLOR_BUFFER_BIT);
    glDrawElements(GL_TRIANGLES, number_vertices, GL_UNSIGNED_SHORT, indices);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    [context presentRenderbuffer:GL_RENDERBUFFER_OES];
}
It would be great if you could give me some hints/help.
cheers
The conventional approach would be to redraw everything whenever you move or erase a line.
Well, I got it to work. I simply forgot to set the vertex pointer to my triangles in drawView. This now works:
- (void)drawView
{
    [EAGLContext setCurrentContext:context];
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
    glClear(GL_COLOR_BUFFER_BIT);
    glVertexPointer(2, GL_FLOAT, 0, vertices);
    glDrawElements(GL_TRIANGLES, number_vertices, GL_UNSIGNED_SHORT, indices);
    [self drawLines];
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    [context presentRenderbuffer:GL_RENDERBUFFER_OES];
}

- (void) drawLines {
    glDisable(GL_TEXTURE_2D);
    GLfloat points[4];
    for (Dataset *data in buttons) {
        CGPoint s = [data screenPosition];
        CGPoint p = [data slot];
        points[0] = (GLfloat)(768 - s.y);
        points[1] = (GLfloat)(1024 - s.x);
        points[2] = (GLfloat)(768 - p.y);
        points[3] = (GLfloat)(1024 - p.x);
        glVertexPointer(2, GL_FLOAT, 0, points);
        glDrawArrays(GL_LINE_STRIP, 0, 2);
    }
    glEnable(GL_TEXTURE_2D);
}

Drawing a texture with an alpha channel doesn't work -- draws black

I am modifying GLPaint to use a different background, which in this case is white. The existing stamp they are using assumes the background is black, so I made a new background with an alpha channel. When I draw on the canvas, the stroke is still black; what gives? When I actually draw, I just bind the texture and it works, so something must be wrong in this initialization.
Here is the photo
- (id)initWithCoder:(NSCoder*)coder
{
    CGImageRef brushImage;
    CGContextRef brushContext;
    GLubyte *brushData;
    size_t width, height;

    if (self = [super initWithCoder:coder])
    {
        CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;
        eaglLayer.opaque = YES;
        // In this application, we want to retain the EAGLDrawable contents after a call to presentRenderbuffer.
        eaglLayer.drawableProperties = [NSDictionary dictionaryWithObjectsAndKeys:
                                        [NSNumber numberWithBool:YES], kEAGLDrawablePropertyRetainedBacking,
                                        kEAGLColorFormatRGBA8, kEAGLDrawablePropertyColorFormat, nil];

        context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];
        if (!context || ![EAGLContext setCurrentContext:context]) {
            [self release];
            return nil;
        }

        // Create a texture from an image
        // First create a UIImage object from the data in an image file, and then extract the Core Graphics image
        brushImage = [UIImage imageNamed:@"test.png"].CGImage;
        // Get the width and height of the image
        width = CGImageGetWidth(brushImage);
        height = CGImageGetHeight(brushImage);
        // Texture dimensions must be a power of 2. If you write an application that allows users to supply an image,
        // you'll want to add code that checks the dimensions and takes appropriate action if they are not a power of 2.
        // Make sure the image exists
        if (brushImage)
        {
            brushData = (GLubyte *) calloc(width * height * 4, sizeof(GLubyte));
            brushContext = CGBitmapContextCreate(brushData, width, width, 8, width * 4, CGImageGetColorSpace(brushImage), kCGImageAlphaPremultipliedLast);
            CGContextDrawImage(brushContext, CGRectMake(0.0, 0.0, (CGFloat)width, (CGFloat)height), brushImage);
            CGContextRelease(brushContext);

            glGenTextures(1, &brushTexture);
            glBindTexture(GL_TEXTURE_2D, brushTexture);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, brushData);
            free(brushData);
        }

        // Set up OpenGL states
        glMatrixMode(GL_PROJECTION);
        CGRect frame = self.bounds;
        glOrthof(0, frame.size.width, 0, frame.size.height, -1, 1);
        glViewport(0, 0, frame.size.width, frame.size.height);
        glMatrixMode(GL_MODELVIEW);

        glDisable(GL_DITHER);
        glEnable(GL_TEXTURE_2D);
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_DST_ALPHA);

        glEnable(GL_POINT_SPRITE_OES);
        glTexEnvf(GL_POINT_SPRITE_OES, GL_COORD_REPLACE_OES, GL_TRUE);
        glPointSize(width / kBrushScale);
    }
    return self;
}
Your line here:
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_DST_ALPHA);
Needs to be:
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
One minus the destination alpha can be 0, since the dst alpha is usually 1.
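For reference, both common alpha-blend setups use the source alpha rather than the destination alpha; which source factor is appropriate depends on whether the texture data is premultiplied (a general OpenGL ES note, not code from GLPaint itself):
// Straight (non-premultiplied) alpha: scale the source color by its alpha.
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

// Premultiplied alpha (what kCGImageAlphaPremultipliedLast produces):
// the color channels are already multiplied by alpha, so use GL_ONE as the source factor.
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);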