I am trying to do a highlighting action on the iPhone. I can draw lines and such with the finger, but now I want to be able to highlight over that. Are there any algorithms out there you can point me to that do that? Using OpenGL ES, of course.
Thanks
This should produce a pink highlight around your geometry:
// Draw the model normally first.
render_model();
// Highlight pass: flat, untextured lines in the highlight colour.
glDisable(GL_LIGHTING);
glDisable(GL_TEXTURE_2D);
glColor3f(1.0f, 0.0f, 1.0f);
glLineWidth(2.0f);
// Cull front faces and redraw the geometry as lines so an outline appears around it.
glCullFace(GL_FRONT);
render_model_using_GL_LINES();
glCullFace(GL_BACK);
// Restore the state changed for the highlight pass.
glEnable(GL_TEXTURE_2D);
glEnable(GL_LIGHTING);
Is there a Sprite Kit way to do the cocos2d draw method?
-(void)draw
{
ccDrawColor4B(255, 255, 255, 255);
ccDrawCircle(mySprite.position, attackRange, 360, 30, false);
[super draw];
}
Thank you!
There's no custom OpenGL drawing in Sprite Kit (as of iOS 7.1).
While you can draw circles and other shapes using SKShapeNode, shape nodes are meant primarily for debugging purposes (analogous to the ccDraw functions in cocos2d). The main problem is that shape nodes are not drawn in batches, unlike sprites, so they render inefficiently.
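That said, if a shape node is acceptable for your case, a rough Sprite Kit equivalent of the ccDrawCircle call above might look like the following sketch (mySprite and attackRange are the names from your cocos2d code and are assumed to exist; the path-based API is used because it is available in iOS 7):
SKShapeNode *rangeCircle = [SKShapeNode node];
CGRect box = CGRectMake(-attackRange, -attackRange, attackRange * 2.0, attackRange * 2.0);
CGPathRef circlePath = CGPathCreateWithEllipseInRect(box, NULL);
rangeCircle.path = circlePath;
CGPathRelease(circlePath);
rangeCircle.strokeColor = [SKColor whiteColor];   // matches ccDrawColor4B(255, 255, 255, 255)
rangeCircle.fillColor = [SKColor clearColor];
rangeCircle.lineWidth = 1.0;
rangeCircle.position = mySprite.position;
[self addChild:rangeCircle]; // added to the scene instead of being drawn in a draw method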
My render to texture iPhone code only works if I disable MSAA, otherwise all I get is a black texture. What could be the cause of the problem?
Here is my code:
glViewport(0,0, target->_Width, target->_Height);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);
glBindTexture(GL_TEXTURE_2D, target->_Handle);
// render stuff here
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 0, 0, target->_Width, target->_Height, 0);
glBindTexture(GL_TEXTURE_2D, 0);
Apparently, when you are using MSAA for your main framebuffer, you have to use it for any other FBOs you want to render to as well. Since GL_TEXTURE_2D_MULTISAMPLE is not available on OpenGL ES 2, the solution I have found is quite simply to apply the same modifications you need to go from regular rendering to MSAA rendering, to your render-to-texture code as well.
You need 3 additional buffers: a multi-sampled color renderbuffer, a multi-sampled depth renderbuffer, and a new FBO to attach them to. Bind the new FBO instead of the texture FBO before rendering. After rendering, resolve the new MSAA FBO into the texture FBO, the same way you do in your main rendering code using glResolveMultisampleFramebufferAPPLE().
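A minimal sketch of that setup, assuming OpenGL ES 2.0 with the APPLE_framebuffer_multisample extension (msaaFBO, msaaColor, msaaDepth and textureFBO are placeholder names, the sample count of 4 is an assumption, and width/height should match your texture size):
GLuint msaaFBO, msaaColor, msaaDepth;
glGenFramebuffers(1, &msaaFBO);
glBindFramebuffer(GL_FRAMEBUFFER, msaaFBO);
// Multisampled colour renderbuffer.
glGenRenderbuffers(1, &msaaColor);
glBindRenderbuffer(GL_RENDERBUFFER, msaaColor);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER, 4, GL_RGBA8_OES, width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, msaaColor);
// Multisampled depth renderbuffer.
glGenRenderbuffers(1, &msaaDepth);
glBindRenderbuffer(GL_RENDERBUFFER, msaaDepth);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER, 4, GL_DEPTH_COMPONENT16, width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, msaaDepth);
// ... bind msaaFBO and render the scene here ...
// Resolve the multisampled result into the texture-backed FBO.
glBindFramebuffer(GL_READ_FRAMEBUFFER_APPLE, msaaFBO);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER_APPLE, textureFBO);
glResolveMultisampleFramebufferAPPLE();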
Note that, for some reason, rendering to a texture with MSAA enabled works without these modifications in the simulator. Maybe it uses GL_TEXTURE_2D_MULTISAMPLE automatically?
The cause of your problem is that you cannot render to a multisampled texture in OpenGL ES. Indeed, if I recall correctly, multisampled textures don't exist in OpenGL ES. Desktop OpenGL allows you to do it, but it introduces a whole new texture target (GL_TEXTURE_2D_MULTISAMPLE) in order to do it.
A buffer that offers multisampling is not the same thing as a regular texture; that's why Desktop OpenGL uses a special texture target, which has its own GLSL sampler type.
So I just started working on something in OpenGL ES 2.0. I get the general impression that the switch from the 1.1 template to the 2.0 template in Xcode has caused some confusion for everyone, so as a result there isn't much helpful material on 2.0 (if there is anything really good and informative out there for 2.0, like 71squared's videos on the 1.1 template, you're welcome to post a link to it).
My problem is displaying an image on the screen.
Right now, I've got this in my drawFrame method.
[(EAGLView *)self.view setFramebuffer];
// Replace the implementation of this method to do your own custom drawing.
static float transY = 0.0f;
glClearColor(0.0f, 1.0f, 0.0f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT);
[playerCube drawAtPoint:CGPointMake(160.0f, 240.0f)];
if ([context API] == kEAGLRenderingAPIOpenGLES2)
{
// Use shader program.
glUseProgram(program);
// Update uniform value.
glUniform1f(uniforms[UNIFORM_TRANSLATE], (GLfloat)transY);
transY += 0.075f;
// Validate program before drawing. This is a good check, but only really necessary in a debug build.
// DEBUG macro must be defined in your debug configurations if that's not already the case.
if (![self validateProgram:program])
{
NSLog(#"Failed to validate program: %d", program);
return;
}
}
else
{
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glTranslatef(0.0f, (GLfloat)(sinf(transY)/2.0f), 0.0f);
transY += 0.075f;
}
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
[(EAGLView *)self.view presentFramebuffer];
Essentially the template without the square bouncing around. This is in my ViewController.m, by the way. That works fine on its own. I've not had much luck with the code for displaying the image, however. I'm using the Texture2D file from Apple's Crash Landing example, but I haven't had much luck with that no matter where the image-related code is put.
I've used playerCube = [[Texture2D alloc] initWithImage:[UIImage imageNamed:@"Cubix.png"]]; to allocate it, which is probably right even if I've not been putting it in the right location, and [playerCube drawAtPoint:CGPointMake(160.0f, 240.0f)]; to draw it.
Wait, that wouldn't show anything at the start because I've not set the height and width of the image, right? I guess I sort of answered that part of my question. Would it work if I used something like image.frame = CGRectMake(); or would that not work with Texture2D? Also, where should I be putting this code for it to work as it would?
Hope I made some sense in this. I've never posted a question on StackOverflow before.
I show how to display an image as a texture in OpenGL ES 2.0 on the iPhone in this example, where the image is a frame of video from the camera, and this example, where I pull in a PVR-compressed texture as an image. The former example is described in the writeup here.
I describe how the vertex and fragment shaders work, along with all the supporting code for them, in the video for the OpenGL ES 2.0 class which is part of my freely available iOS development course on iTunes U. Additionally, I highly recommend you read Jeff LaMarche's series of posted chapters from his unpublished OpenGL ES 2.0 book.
In short, once you have the texture loaded in, you'll need to attach it as a uniform to your shader program using code like the following:
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, myTexture);
glUniform1i(myTextureUniform, 0);
Within your fragment shader, you'll need to define the texture uniform:
uniform sampler2D myTexture;
and then sample the color from the texture at the appropriate point:
gl_FragColor = texture2D(myTexture, textureCoordinate);
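Putting those two lines together, a minimal texturing fragment shader, written as a source string the way it would typically appear in the app, might look like this (the varying name textureCoordinate is an assumption and has to match whatever your vertex shader writes):
static const char *fragmentShaderSource =
    "precision mediump float;\n"
    "varying vec2 textureCoordinate;\n"   // interpolated texture coordinate from the vertex shader
    "uniform sampler2D myTexture;\n"
    "void main()\n"
    "{\n"
    "    gl_FragColor = texture2D(myTexture, textureCoordinate);\n"
    "}\n";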
Again, I go into this in more detail within my class, and you can use these examples as starting points to work from.
Texture2D, as presented in Crash Landing, isn't going to work with ES 2.x. It should correctly upload your texture, but the two drawing methods, drawAtPoint: and drawInRect:, won't work. They rely on the GL calls glVertexPointer and glTexCoordPointer; neither of those survives into ES 2.x. The logic behind that is that ES 1.x has a fixed pipeline, so it has a defined slot for vertex positions (which you supply with glVertexPointer), a defined slot for texture coordinates (via glTexCoordPointer), and a series of predefined ways in which that data can end up being processed. ES 2.x is fully programmable, so it has no need to supply one way of providing vertices, another for providing texture coordinates, a third for colours, and so on. It has just one way to supply an attribute for a vertex; it is up to you to specify how that relates to your vertex program, and then up to your vertex program to figure out what needs to be processed and passed on to your fragment shader.
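As a rough illustration of that (this is not code from Texture2D; the program handle and the attribute names are assumptions that must match your own vertex shader), the ES 2.x replacement for glVertexPointer/glTexCoordPointer is a pair of generic attributes:
// Look up the generic attributes your vertex shader declares, then enable them.
GLint positionAttribute = glGetAttribLocation(program, "position");
GLint texCoordAttribute = glGetAttribLocation(program, "inputTextureCoordinate");
glEnableVertexAttribArray(positionAttribute);
glEnableVertexAttribArray(texCoordAttribute);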
Really you shouldn't be conflating providing textures and providing geometry anyway. Crash Landing was withdrawn because it suggests poor programming patterns — my personal suspicion is that this was one of them.
In addition to not having fixed attributes for a vertex, ES 2.x doesn't supply any fragment shaders. If you want to texture map, you have to write suitable vertex and fragment shaders. The ones in the new GL template only do Gouraud shading, if I recall correctly.
An additional issue hinted at in the code is that ES 2.x doesn't have a matrix stack. You need to do something yourself about communicating to your vertex shader how it should map from the original vertices to screen locations. I suspect that if you just want to draw a texture as though it were 2D, that isn't so much of a problem.
Probably the correct route forward is:
modify Texture2D so that it can bind the texture it has loaded and remove all attempts to draw it
modify the vertex and fragment shaders supplied in the GL template to do texturing, and modify the buffer you submit so that it has texture coordinates in it (see the sketch after this list)
then modify the vertex shader so that you can do something other than a sine animation
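For the second step, a minimal sketch of submitting an interleaved position/texture-coordinate quad could look like this (positionAttribute and texCoordAttribute are the attribute locations looked up with glGetAttribLocation as in the earlier sketch; the vertex layout itself is just one possible choice):
static const GLfloat quad[] = {
    //   x,     y,    s,    t
    -1.0f, -1.0f, 0.0f, 0.0f,
     1.0f, -1.0f, 1.0f, 0.0f,
    -1.0f,  1.0f, 0.0f, 1.0f,
     1.0f,  1.0f, 1.0f, 1.0f,
};
// Positions are the first two floats of each vertex, texture coordinates the last two.
glVertexAttribPointer(positionAttribute, 2, GL_FLOAT, GL_FALSE, 4 * sizeof(GLfloat), quad);
glVertexAttribPointer(texCoordAttribute, 2, GL_FLOAT, GL_FALSE, 4 * sizeof(GLfloat), quad + 2);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);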
If you're already happy with ES 1.x, probably a good place to start is the ES 2.x reference sheet and maybe the lighthouse3d GLSL tutorial — though that's for desktop OpenGL 2.0. On desktop GL 2.0 there's a lot of support built into GLSL for emulating the old fixed pipeline. None of that is in ES 2.x. But it should at least explain things about attributes, uniforms, samplers, etc. I'm afraid I've yet to come across an ES 2.x focussed tutorial.
How can I draw an ellipse using OpenGL ES?
There are many good tutorials that help you learn OpenGL ES. These tutorials include examples of drawing ellipses, circles, squares, etc.
Some are here,
OpenGL ES for iPhone tutorial - By Simon Maurice
Open GL 2.0 for iPhone tutorial - by Ray Wenderlich
Resources for iPhone OpenGL ES - Beginners / Intermediate
Best Wishes
Draw a bunch of line segments around the circumference.
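For example, a rough sketch in the ES 1.x style (centerX, centerY, radiusX, radiusY and the segment count are assumptions you would fill in; cosf, sinf and M_PI come from <math.h>):
#define NUM_SEGMENTS 64
GLfloat vertices[NUM_SEGMENTS * 2];
for (int i = 0; i < NUM_SEGMENTS; i++) {
    float theta = 2.0f * (float)M_PI * (float)i / (float)NUM_SEGMENTS;
    vertices[i * 2]     = centerX + radiusX * cosf(theta);   // x
    vertices[i * 2 + 1] = centerY + radiusY * sinf(theta);   // y
}
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(2, GL_FLOAT, 0, vertices);
glDrawArrays(GL_LINE_LOOP, 0, NUM_SEGMENTS);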
Also note that your system may not support anti-aliased lines (the iPhone doesn't). There's a nice article on drawing antialiased lines here. You can do something similar to draw the line segments (though you don't need the end-caps, so it's a bit simpler).
Another solution to anti-aliasing that requires more polygons is here.
I would like to draw 2d shapes like this in an iPhone app:
(image: beveled circle example, http://www.shaggyfrog.com/junk/beveled-circle.jpg)
I asked a similar question here to see if I could do it easily with Quartz, but didn't get a solution. So I got to thinking that I might be able to leverage an existing 2D library out there, and then I thought of cocos2d.
The goal is to draw these kinds of beveled shapes dynamically, i.e., using arbitrary colours, and possibly with the highlight/bevel drawn at an arbitrary position.
Is this possible with cocos2d?
As far as my knowledge of cocos2d goes, it won't enable you to do this in any way that OpenGL itself wouldn't. Cocos2d uses OpenGL under the hood, and it comes with no built-in facilities for creating such graphics.
Since the bevel is used to create a 3D effect, perhaps you shouldn't be looking at simulating it with 2D drawing, but instead use a 3D drawing library? OpenGL would certainly be capable of drawing such shapes. Cocos2d focuses on 2D drawing instead of 3D.
I'm not sure if Cocos2D would allow for a custom object to draw 3D using the underlying OpenGL mechanism. I have never tried.
Wouldn't it instead be easier to create the image in Photoshop and adjust the colors dynamically? I'm not sure what you are trying to do.
You could also create a mask shape with a transparent "bevel effect" and scale it along with the image you want to have the shine.
Aside from the bevel effect, if you want to "colorize" each semi-circle, you can use [sprite setColor:] or sprite.color = ccc3(r,g,b)
CCSprite *sprite = [CCSprite spriteWithSpriteSheet:sheet rect:CGRectMake(32 * idx,0,128,32)];
[sprite setColor:ccc3(CCRANDOM_0_1()*255,CCRANDOM_0_1()*255,CCRANDOM_0_1()*255)];
You would design a "white semicircle" with beveled (gray) edges. Then you can make sprites and color each separately.