iOS - off-screen rendering using OpenGL ES2 - iPhone

I want to implement image editing using OpenGL shaders. I have found some examples of how to implement off-screen rendering using OpenGL ES 1.
Do you know of any examples of off-screen rendering using OpenGL ES 2 and shaders on the iPhone?
Thank you in advance

You need to use framebuffer objects (FBOs), which are part of core OpenGL ES 2.0.
This works the same way as with OpenGL ES 1.x, except the functions lose their OES suffix (FBO was an OES extension to ES 1, not part of the core).
You might like this tutorial: http://programming4.us/multimedia/3288.aspx. The code is pretty simple and should be easy to adapt to GLES 2.
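For example, a minimal ES 2.0 off-screen pass (a sketch in plain C; the size, names, and error handling are placeholders, not taken from any particular sample) looks roughly like this - create a texture, attach it to an FBO with the suffix-less functions, draw your editing pass into it, and read the result back if you need it on the CPU:

    #include <OpenGLES/ES2/gl.h>
    #include <stdlib.h>

    // Sketch: render an image-editing pass off-screen and read the result back.
    // Assumes an EAGLContext is already current; size and names are placeholders.
    void renderOffscreenPass(void)
    {
        const GLsizei width = 512, height = 512;
        GLuint offscreenTexture, offscreenFBO;

        // Texture that will receive the rendered image.
        glGenTextures(1, &offscreenTexture);
        glBindTexture(GL_TEXTURE_2D, offscreenTexture);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, NULL);

        // FBO with the texture as its color attachment (note: no OES suffixes).
        glGenFramebuffers(1, &offscreenFBO);
        glBindFramebuffer(GL_FRAMEBUFFER, offscreenFBO);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                               GL_TEXTURE_2D, offscreenTexture, 0);

        if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
            return; /* handle incomplete framebuffer */

        glViewport(0, 0, width, height);
        // ... glUseProgram() your editing shader and draw a full-screen quad here ...

        // Read the processed pixels back if you need the edited image on the CPU.
        GLubyte *pixels = (GLubyte *)malloc((size_t)width * height * 4);
        glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        free(pixels); /* keep or upload the data as needed before freeing */
    }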

Related

How to use custom shaders together with GLKit

I keep reading how wonderfully easy it is to work with GLKit and your own custom shaders. But, so far, I have failed to find any information on how to actually do it. How can I take my own shader and "plug it in" to an existing GLKit project?
Well, you can look at this blog, which uses GLKit to build a basic OpenGL ES 2.0 application. There are also links to other blogs if you're looking to dig deeper into it:
GLKit + OpenGL ES 2.0 + iOS5 Programming blog
The only thing it doesn't cover is GLKBaseEffect, but if you want to build custom shaders like you said, you definitely don't want to use it anyway.
GLKit provides 4 basic things:
A math library (matrices, vectors, ...)
A View/Controller combo made especially for drawing OpenGL content
A texture loader class (GLKTextureLoader)
GLKBaseEffect, which mimics the OpenGL ES 1.1 fixed-function pipeline
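If it helps, here is a rough sketch (plain C GL calls; the attribute name and error handling are assumptions of mine, not GLKit API) of compiling and linking your own program. Do this once after the GLKView's EAGLContext is current, then call glUseProgram() in -glkView:drawInRect: instead of GLKBaseEffect's prepareToDraw:

    #include <OpenGLES/ES2/gl.h>
    #include <stdio.h>

    // Compile one shader stage and log any compile error.
    static GLuint compileShader(GLenum type, const char *source)
    {
        GLuint shader = glCreateShader(type);
        glShaderSource(shader, 1, &source, NULL);
        glCompileShader(shader);

        GLint ok = 0;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
        if (!ok) {
            char log[512];
            glGetShaderInfoLog(shader, sizeof(log), NULL, log);
            fprintf(stderr, "shader error: %s\n", log);
        }
        return shader;
    }

    // Link a vertex/fragment pair into a program you can glUseProgram() later.
    static GLuint linkProgram(const char *vsSource, const char *fsSource)
    {
        GLuint program = glCreateProgram();
        glAttachShader(program, compileShader(GL_VERTEX_SHADER, vsSource));
        glAttachShader(program, compileShader(GL_FRAGMENT_SHADER, fsSource));
        glBindAttribLocation(program, 0, "position"); /* assumed attribute name */
        glLinkProgram(program);
        return program;
    }

After that, drawing is the usual ES 2.0 routine: bind the program, set your uniforms and vertex attributes, and issue the draw calls yourself.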

ES 2.0 Multi-Pass & Render to Texture Implementation

I need help setting up multi-pass rendering with OpenGL ES 2.0 on the iPhone. I haven't been able to find an example which implements both rendering to a texture and multi-pass shading.
I'm looking for some instructions and sample code which implement:
First stage: Render to a texture
Second stage: Input that texture and render to screen
I have referenced Apple's OpenGL ES Programming Guide, OpenGL Shading Language (Orange Book), and O'Reilly's iPhone 3D Programming Book.
The Orange Book discusses deferred shading and provides two shader programs for first-pass and second-pass rendering, but it doesn't provide example code to set up that application or show how to pass data between the two shaders.
Questions:
How to render to texture?
Using glDrawElements
How to input that texture to the next pass?
How to implement two shading programs?
How to alternate first- and second-pass shading programs?
Need to attach, detach, and call 'use' for each pass?
How to implement multi-pass shading?
I wrote a short example of doing just this (multiple render-to-texture passes on the iPhone using OpenGL ES 2.0) a few weeks ago: http://www.mat.ucsb.edu/a.forbes/blog/?p=245
Edit: this post is a bit old, and it has moved here: http://blog.angusforbes.com/openglglsl-render-to-texture/
OK, first of all: I'm no expert on OpenGL ES 2.0. I was in much the same situation when I wanted to do a multi-pass render setup in one of my first OpenGL ES applications.
I also used the Orange Book. Check chapter 12 (Framebuffer Objects), the Examples section. The first example demonstrates how to use a framebuffer to render to a texture, and then draws that texture to the screen.
Basically, using that example I created an application that renders some geometry to a texture using an effect shader, then renders that texture to the screen, layered with some other content, all using a different shader.
I'm not sure if this is the best approach, but it works for my purposes. My setup:
I create two framebuffers, the default one and an offscreen one, and do the same for the renderbuffers
I create a texture which the app will render to
I bind the offscreen framebuffer, and attach the texture to it using glFramebufferTexture2D
My rendering:
bind the offscreen framebuffer.
use my first shader program
draw my geometry
bind the default framebuffer
use my second shader program
draw a full-screen quad that samples the texture.
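In rough ES 2.0 calls, that per-frame loop looks something like the sketch below (the FBO, texture, and program handles, the sizes, and the "u_texture" uniform name are placeholders of mine, not code from the book):

    #include <OpenGLES/ES2/gl.h>

    // Sketch of the per-frame two-pass loop described above. The handles are
    // assumed to have been created during setup; attribute 0 holds positions.
    void drawFrame(GLuint offscreenFBO, GLuint offscreenTexture,
                   GLuint defaultFBO, GLuint firstProgram, GLuint secondProgram)
    {
        // Pass 1: render the geometry into the offscreen texture.
        glBindFramebuffer(GL_FRAMEBUFFER, offscreenFBO);
        glViewport(0, 0, 512, 512);               /* match the texture size */
        glClear(GL_COLOR_BUFFER_BIT);
        glUseProgram(firstProgram);
        // ... set uniforms/attributes and glDrawElements() the geometry ...

        // Pass 2: draw a full-screen quad that samples that texture.
        glBindFramebuffer(GL_FRAMEBUFFER, defaultFBO);
        glViewport(0, 0, 320, 480);               /* match the drawable size */
        glClear(GL_COLOR_BUFFER_BIT);
        glUseProgram(secondProgram);

        glActiveTexture(GL_TEXTURE0);
        glBindTexture(GL_TEXTURE_2D, offscreenTexture);
        glUniform1i(glGetUniformLocation(secondProgram, "u_texture"), 0);

        const GLfloat quad[] = { -1,-1,  1,-1,  -1,1,  1,1 };
        glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 0, quad);
        glEnableVertexAttribArray(0);
        glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    }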

How does the OpenGL ES template work for the iPhone?

So, I've been trying to figure out why the square moves up and down in the iPhone Simulator when I Build and Run the template that Apple provides for OpenGL ES. I don't understand why, for example, they have ES1Render.m and ES2Render.m instead of just one ESRender.m. Also, where is the equivalent of glutDisplayFunc and glutTimerFunc? Thanks in advance.
They are trying to show the two versions of OpenGL ES. One uses shaders (v2) and the other (v1) uses the older fixed-function OpenGL approach. In the ES2 renderer, I believe they are doing all the movement in the shader code. If you want something that looks like older OpenGL code, try setting it to use the version 1 renderer. Then you can use stuff like the older demos on http://nehe.gamedev.net/. You just need to fill in the "render" function with your drawing code.
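For reference, the ES2 renderer's vertex shader is roughly along these lines (paraphrased from memory, not verbatim from Apple's template): a "translate" uniform is incremented every frame on the CPU, and the shader offsets the y position by sin(translate), which is what makes the square bob up and down.

    // Roughly what the ES2 template's vertex shader does (paraphrased, not verbatim).
    static const char *kTemplateStyleVertexShader =
        "attribute vec4 position;                  \n"
        "attribute vec4 color;                     \n"
        "varying vec4 colorVarying;                \n"
        "uniform float translate;                  \n"
        "void main()                               \n"
        "{                                         \n"
        "    gl_Position = position;               \n"
        "    gl_Position.y += sin(translate) / 2.0;\n"
        "    colorVarying = color;                 \n"
        "}                                         \n";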
EAGLView has a timer which sets the frame rate, but there is a method which allows you to set it to be whatever you like.

Does OpenGL ES support environment shaders?

I want to make a metallic 3D object that appears to be reflective. I want to accomplish this using an environment shader that uses either a sphere or cube map, to which I can assign an image or texture as the "reflection" source.
Does OpenGL ES on the iPhone support this in any versions?
OpenGL ES 2.0 provides shader support. However, it isn't available on many of the mobile devices on the market today, so it would be important for you to code both ES 1.1 and ES 2.0 versions of the graphics.
Apple Dev Center has tons of information on the transition:
The fixed-function pipeline of OpenGL ES 1.1 provides good baseline behavior for a 3D graphics pipeline, from transforming and lighting vertices to blending the final pixels with the framebuffer. If you choose to implement an OpenGL ES 2.0 application, you will need to duplicate this functionality. On the other hand, OpenGL ES 2.0 is more flexible than OpenGL ES 1.1. Custom vertex and fragment operations that would be difficult or impossible to implement using OpenGL ES 1.1 can be trivially implemented with an OpenGL ES 2.0 shader. Implementing a custom operation in an OpenGL ES 1.1 application often requires multiple rendering passes and complex changes to OpenGL ES state that obscure the intent of the code. As your algorithms grow in complexity, shaders convey those operations more clearly and concisely and with better performance.
http://developer.apple.com/iphone/library/documentation/3DDrawing/Conceptual/OpenGLES_ProgrammingGuide/DeterminingOpenGLESCapabilities/DeterminingOpenGLESCapabilities.html#//apple_ref/doc/uid/TP40008793-CH102-SW1
In the old days, a "metallic" look was achieved using a technique called "environment mapping" or "reflection mapping".
Since no programmable shaders are available in OpenGL ES 1.1, simple reflection mapping can be done in software: transform the vertex normals according to the reflection source/camera and derive the texture UV coordinates from the transformed normal vector. The iPhone has the horsepower to do this easily, at least with decent vertex counts.
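A sketch of that CPU-side sphere mapping (illustrative names, assuming the normal has already been transformed into eye space):

    #include <math.h>

    // Map an eye-space vertex normal straight to UV coordinates in a
    // sphere-map style "reflection" texture. Names are illustrative.
    void sphereMapUV(float nx, float ny, float nz, float *u, float *v)
    {
        // Normalize in case the transform introduced scaling.
        float len = sqrtf(nx * nx + ny * ny + nz * nz);
        if (len > 0.0f) { nx /= len; ny /= len; }

        // The x/y of the eye-space normal map to [0,1] texture coordinates.
        *u = nx * 0.5f + 0.5f;
        *v = ny * 0.5f + 0.5f;
    }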
OpenGL ES supports most of the features of OpenGL (plus some extra features for mobile devices). If I recall correctly, the iPhone 3GS supports fragment shaders, while the older iPhone 3G only supports the fixed-function pipeline.

Is OpenGL required for my iPhone game?

On an iPhone:
If I am writing a game that has multiple levels, with multiple animations (image sequences), JPGs and PNGs (transparent), some full screen and some not, some looped and some played only once, what is the best way of doing it? Each level might have up to 10 MB of images. Add to this music and video (cut scenes). All 2D graphics, no 3D models.
Is OpenGL required? Or can this be achieved with Quartz or Core Animation?
I do something similar using UIViews and a bit of Core Graphics (Quartz 2D), and it works fine. I've found that custom drawing in Core Graphics pushes it a bit further, though - UIViews work best when given images rather than having to draw themselves. Also watch out for lots of transparency. You'll probably find that large or long (many-frame) animations are the killer, though. There are some techniques for minimising the impact of the animations, which involve allowing it to purge images from memory if they're not being immediately displayed (I forget the setting). This may result in your animations not being as smooth as they would otherwise be (not sure if OpenGL ES would help here, though).
You should probably prototype using UIViews, and then decide whether it's worth doing the extra work for OpenGL ES. Also, if you're not already familiar with OpenGL/OpenGL ES, it's a steep learning curve.
I've used both Quartz and OpenGL to do graphics on the iPhone, and while OpenGL has a much higher learning curve, it gives much better performance than Quartz. Let's say you have a scene that involves drawing 6 large, semi-transparent images on top of each other. Quartz will do it, but you'll probably get 15 fps at best. OpenGL takes advantage of the iPhone's PowerVR chip and the drawing is hardware accelerated, so you can load those images into OpenGL textures and render at 25-30 fps no problem.
I would agree with Phil, though - try doing it using Quartz and see if it meets your needs. OpenGL is extremely powerful, but its API lacks some of the convenience features of Quartz (such as saving/restoring graphics state).
On another note entirely, you might want to take a look at Unity's iPhone development tools (http://unity3d.com/#iphone). They leverage OpenGL but provide you with an IDE to create your game. It abstracts away all of the graphics-level code, so you can focus on the high-level gameplay. My brother uses it to write iPhone games, and it's extremely cool.
I recommend having a look at Cocos2D iPhone.
cocos2d for iPhone is a framework for building 2D games, demos, and other graphical/interactive applications. It is based on the cocos2d design: it uses the same API, but instead of Python it uses Objective-C.
Most likely OpenGL.
One advantage of using OpenGL ES would be that the investment of time for learning the technology could be applied to other platforms/contexts and your game is potentially more port-friendly. These may not be important to you.
I would suggest using Quartz. OpenGL ES is really best for 3D stuff. However, both work fairly well, so if you already know OpenGL ES, it's fine to use that.
You should consider using far fewer resources in your game; Apple recommends not using more than 10 MB of textures for OpenGL apps.
Try texture atlases, reused graphics, and tile-based graphics, but avoid using too many graphic assets.
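As a sketch of the texture atlas idea (illustrative, assuming the frames are packed in a simple grid), computing the UV rectangle for one animation frame looks like this, so many frames can share a single texture:

    // Pick a sub-image out of a texture atlas: instead of one texture per
    // frame of animation, pack frames into a grid and compute UVs per frame.
    typedef struct { float u0, v0, u1, v1; } AtlasRegion;

    AtlasRegion atlasFrame(int frameIndex, int columns, int rows)
    {
        AtlasRegion r;
        float w = 1.0f / (float)columns;
        float h = 1.0f / (float)rows;
        r.u0 = (float)(frameIndex % columns) * w;
        r.v0 = (float)(frameIndex / columns) * h;
        r.u1 = r.u0 + w;
        r.v1 = r.v0 + h;
        return r;
    }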