So, I've been trying to figure out why the square moves up and down in the iPhone simulator when I Build and Run the template that Apple provides for OpenGL ES. I don't understand why, for example, they have ES1Renderer.m and ES2Renderer.m instead of just one ESRenderer.m. Also, where is the equivalent of glutDisplayFunc and glutTimerFunc? Thanks in advance.
They are trying to show the two versions of OpenGL ES. One uses shaders (v2) and the other (v1) uses older OpenGL technology. In the ES2 renderer I believe they are doing all the movement in the shader code. If you want something that looks like older OpenGL code, try setting it to use the version 1 renderer. Then you can use stuff like the older demos on http://nehe.gamedev.net/. You just need to fill in the "render" function with your drawing code.
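To give an idea, a minimal fixed-function (ES1) render method might look like the sketch below. This is illustrative rather than the template's exact code, but a sine-driven glTranslatef like this is also what makes the template's square bounce up and down:

```objc
// Sketch of an ES1-style render method (assumes a renderer class whose
// framebuffer is already bound; needs <OpenGLES/ES1/gl.h> and <math.h>).
- (void)render
{
    static const GLfloat squareVertices[] = {
        -0.5f, -0.33f,
         0.5f, -0.33f,
        -0.5f,  0.33f,
         0.5f,  0.33f,
    };
    static float transY = 0.0f;

    glClearColor(0.5f, 0.5f, 0.5f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    // The sine-driven translation is what moves the square up and down.
    glTranslatef(0.0f, sinf(transY) / 2.0f, 0.0f);
    transY += 0.075f;

    glVertexPointer(2, GL_FLOAT, 0, squareVertices);
    glEnableClientState(GL_VERTEX_ARRAY);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
}
```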
EAGLView has a timer which drives the frame rate, and there is a method which allows you to set the interval to whatever you like.
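On iOS the glutDisplayFunc/glutTimerFunc role is usually played by CADisplayLink, which calls a method of yours once per display refresh. A minimal sketch (drawFrame is an assumed method name):

```objc
#import <QuartzCore/QuartzCore.h>

// Fires once per screen refresh, much like glutDisplayFunc combined
// with a vsync-driven glutTimerFunc.
CADisplayLink *displayLink =
    [CADisplayLink displayLinkWithTarget:self
                                selector:@selector(drawFrame)];
displayLink.frameInterval = 1; // 1 = every refresh, 2 = every other, ...
[displayLink addToRunLoop:[NSRunLoop currentRunLoop]
                  forMode:NSDefaultRunLoopMode];
```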
I am trying to do dynamic drawing that draws a particle wherever the finger moves along the screen, exactly like in the game "Draw Something".
My intention is not to make a game, but I need this behavior for another app I am making.
Basically the behavior I am looking for is what the Apple sample code GLPaint does, simple drawing with the finger.
Now for the question: I know that this can be done with OpenGL (the GLPaint sample app does it), but I am new to OpenGL and would like to use Core Graphics instead, since this is only 2D drawing.
Would it be possible to do this using Core Graphics only? If so, would performance suffer? Some sample code for something like this using Core Graphics would be appreciated.
Is the best way to do this in Core Graphics to keep drawing a particle over and over again?
Thank you,
Oscar
Have you considered using something like Cocos2D to achieve the OpenGL drawing? It's quite intuitive to pick up.
I'm sorry, I have no idea how easy or hard it would be to do with Core Graphics, or how well it would perform.
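If you do want to try Core Graphics first, a minimal finger-drawing view can be sketched like this (all names here are illustrative, not from GLPaint):

```objc
#import <UIKit/UIKit.h>

// Collects points as the finger moves and strokes the accumulated
// path in drawRect:.
@interface ScribbleView : UIView
@property (nonatomic, strong) UIBezierPath *path;
@end

@implementation ScribbleView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (!self.path) {
        self.path = [UIBezierPath bezierPath];
        self.path.lineWidth = 8.0;
    }
    [self.path moveToPoint:[[touches anyObject] locationInView:self]];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self.path addLineToPoint:[[touches anyObject] locationInView:self]];
    [self setNeedsDisplay]; // re-stroke the whole path each move
}

- (void)drawRect:(CGRect)rect
{
    [[UIColor blackColor] setStroke];
    [self.path stroke];
}

@end
```

One caveat: re-stroking the entire path on every touch gets slower as the drawing grows. GLPaint avoids that by rendering each new segment incrementally into a retained OpenGL framebuffer, which is one reason Apple used OpenGL there.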
I keep reading how wonderfully easy it is to work with GLKit and your own custom shaders. But, so far, I have failed to find any information on how to actually do it. How can I take my own shader and "plug it in" to an existing GLKit project?
Well, you can look at this blog, which uses GLKit to build a basic OpenGL ES 2.0 application. There are also links to other blogs if you're looking to dig deeper into it:
GLKit + OpenGL ES 2.0 + iOS5 Programming blog
The only thing it doesn't cover is GLKBaseEffect, but if you want to build custom shaders like you said, you definitely don't want to use it anyway.
GLKit provides 4 basic things:
A math library (matrices, vectors, ...)
A View/Controller combo made especially for drawing OpenGL content
A texture loader class (GLKTextureLoader)
GLKBaseEffect, which mimics OpenGL 1.0's fixed pipeline
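Plugging in your own shader mostly means compiling/linking a GL program yourself and calling glUseProgram inside the GLKView draw callback; GLKit doesn't get in the way. A rough sketch, where loadShaders is an assumed helper that compiles and links your GLSL pair and _program is a GLuint ivar:

```objc
#import <GLKit/GLKit.h>

// Inside a GLKViewController subclass.
- (void)viewDidLoad
{
    [super viewDidLoad];

    GLKView *view = (GLKView *)self.view;
    view.context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    [EAGLContext setCurrentContext:view.context];

    _program = [self loadShaders]; // assumed helper: compile + link GLSL
}

// GLKit calls this every frame; just bind your program and draw.
- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect
{
    glClearColor(0.2f, 0.2f, 0.2f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    glUseProgram(_program);
    // ... set uniforms/attributes and issue draw calls here ...
}
```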
I want to implement image edition using OpenGL shaders. I have found some examples how to implement off-screen rendering using OpenGL ES1.
Do you know of any examples of off-screen rendering using OpenGL ES 2 and shaders on iPhone?
Thank you in advance
You need to use framebuffer objects (FBOs), which are part of core OpenGL ES 2.0.
This works the same way as in OpenGL ES 1.x, except the functions lose their OES suffix (because FBO was an OES extension to ES 1, not part of the core).
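A minimal sketch of the ES 2.0 setup, rendering into a 512x512 texture (the size and variable names are just for illustration):

```objc
// Create the texture we will render into.
GLuint texture;
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 512, 512, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);

// Create an off-screen framebuffer and attach the texture to it.
// Note: no OES suffix in ES 2.0.
GLuint fbo;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, texture, 0);

if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
    NSLog(@"Framebuffer incomplete");
}

// Anything drawn now, with your shaders bound, lands in the texture.
```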
You might like this tutorial: http://programming4.us/multimedia/3288.aspx. The code is pretty simple and should be pretty easy to adapt to GLES2.
I need help setting up multi-pass rendering with OpenGL ES 2.0 on the iPhone. I haven't been able to find an example which implements both rendering to a texture and multi-pass shading.
I'm looking for some instructions and sample code which implement:
First stage: Render to a texture
Second stage: Input that texture and render to screen
I have referenced Apple's OpenGL ES Programming Guide, OpenGL Shading Language (Orange Book), and O'Reilly's iPhone 3D Programming Book.
The Orange Book discusses deferred shading and provides two shader programs for first-pass and second-pass rendering, but doesn't provide example code to set up that application or show how to communicate data between the two shaders.
Questions:
How to render to texture (using glDrawElements)?
How to input that texture to the next pass?
How to implement two shading programs?
How to alternate first- and second-pass shading programs?
Do I need to attach, detach, and call 'use' for each pass?
How to implement multi-pass shading?
I wrote a short example of doing just this (multiple render-to-texture passes on the iPhone using OpenGL ES 2.0) a few weeks ago: http://www.mat.ucsb.edu/a.forbes/blog/?p=245
Edit: this post is a bit old, and it has moved here:
http://blog.angusforbes.com/openglglsl-render-to-texture/
Ok, first of all: I'm no expert on OpenGL ES 2.0. I was in much the same situation, wanting to do a multi-pass render setup in one of my first OpenGL ES applications.
I also used the Orange Book. Check chapter 12. Framebuffer Objects > Examples. The first example demonstrates how to use a framebuffer to render to a texture, and then draws that texture to screen.
Basically using that example I created an application that renders some geometry to a texture using an effect shader, then renders that texture to screen, layered with some other content all using a different shader.
I'm not sure if this is the best approach, but it works for my purposes. My setup:
I create two framebuffers, the default one and an offscreen one, and do the same for the renderbuffers.
I create a texture which the app will render to.
I bind the offscreen framebuffer and attach the texture to it using glFramebufferTexture2D.
My rendering (a rough GL sketch follows after these steps):
Bind the offscreen framebuffer.
Use my first shader program.
Draw my geometry.
Bind the default framebuffer.
Use my second shader program.
Draw a fullscreen quad with the texture attached to it.
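In GL calls, that per-frame sequence looks roughly like this. It's only a sketch: offscreenFramebuffer, defaultFramebuffer, the two programs, the texture, and the sizes are all assumed to have been created during setup, and the actual quad/geometry drawing is elided.

```objc
// Pass 1: render the geometry into the texture via the offscreen FBO.
glBindFramebuffer(GL_FRAMEBUFFER, offscreenFramebuffer);
glViewport(0, 0, textureWidth, textureHeight);
glClear(GL_COLOR_BUFFER_BIT);
glUseProgram(firstProgram);
// ... set uniforms/attributes and draw the geometry ...

// Pass 2: render to screen, sampling the texture produced by pass 1.
glBindFramebuffer(GL_FRAMEBUFFER, defaultFramebuffer);
glViewport(0, 0, backingWidth, backingHeight);
glClear(GL_COLOR_BUFFER_BIT);
glUseProgram(secondProgram);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, texture);
glUniform1i(textureUniformLocation, 0); // sampler reads texture unit 0
// ... draw a fullscreen quad ...

// Then present the color renderbuffer as usual (e.g. via EAGLContext's
// presentRenderbuffer:).
```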
I want to do the following:
Tap the screen and draw 3 circles around the tapped point.
Is it better to do this with Core Animation or OpenGL ES?
Where do I start?
My experience is this: the more complex my app became, the more I realized I should have used OpenGL ES for what I was trying to do.
So, for your situation, if what you described is all there is, sure, Core Graphics does the trick. But I'm guessing there's more to it than three circles.
Coming in with no OpenGL experience at all, it took me about 20 days to get over the ES learning curve.
Thus, my advice is: OpenGL ES for pretty much every frame-to-frame graphics-based app.
As mentioned, the Core Graphics framework is probably what you want. A good way to go about it would be to subclass UIView, then override the two methods drawRect: and touchesEnded:withEvent:.
When a touch event ends on the UIView, you can get the point of the last touch from the event passed to touchesEnded:withEvent:, and store it somehow in the instance of your subclassed UIView.
Then, in your implementation of drawRect:, you'll get the stored last touch point, and draw three circles around it using three calls to CGContextAddEllipseInRect, as discussed here: Quartz 2D Programming Guide: Paths (registration as Apple Developer required).
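Putting those pieces together, a sketch of such a subclass might look like this (names and radii are illustrative):

```objc
#import <UIKit/UIKit.h>

// Remembers the last tap and draws three concentric circles around it.
@interface CircleView : UIView
@property (nonatomic, assign) CGPoint lastTouch;
@end

@implementation CircleView

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.lastTouch = [[touches anyObject] locationInView:self];
    [self setNeedsDisplay]; // triggers drawRect:
}

- (void)drawRect:(CGRect)rect
{
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetStrokeColorWithColor(ctx, [UIColor redColor].CGColor);

    // Three concentric circles with growing radii around the tap point.
    for (int i = 1; i <= 3; i++) {
        CGFloat r = 20.0f * i;
        CGRect box = CGRectMake(self.lastTouch.x - r, self.lastTouch.y - r,
                                2.0f * r, 2.0f * r);
        CGContextAddEllipseInRect(ctx, box);
    }
    CGContextStrokePath(ctx);
}

@end
```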
The advantage of learning OpenGL ES is that the time you put in to learn it will serve you well in the future on iPhone Apps and on other devices.
In OpenGL ES, there's no built-in way to draw a circle, so use sine and cosine to build your circles out of line segments.
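For example, a circle approximated by line segments can be generated and drawn like this (a sketch in ES 1.x style; centerX, centerY, radius, and the 64-segment count are assumed values):

```objc
#include <math.h>

// Approximate a circle with line segments (ES 1.x style drawing).
#define SEGMENTS 64

GLfloat vertices[SEGMENTS * 2];
for (int i = 0; i < SEGMENTS; i++) {
    float angle = 2.0f * M_PI * i / SEGMENTS;
    vertices[i * 2]     = centerX + radius * cosf(angle);
    vertices[i * 2 + 1] = centerY + radius * sinf(angle);
}

glVertexPointer(2, GL_FLOAT, 0, vertices);
glEnableClientState(GL_VERTEX_ARRAY);
glDrawArrays(GL_LINE_LOOP, 0, SEGMENTS);
```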
Core Graphics is definitely simpler, and better for 2D. OpenGL ES is made for 3D, but can also be used for 2D. Both can be used, so if you already know one, use that. It shouldn't really matter that much.
I already knew OpenGL, so I tend to use OpenGL ES even for 2D, but if you haven't used either before, go with Core Graphics.
This could best be done with Quartz 2D (also known as Core Graphics).
See Apple's Quartz 2D programming guide.