I'm aware that OpenGL textures on the iPhone are required to be a power of 2; is this true of OpenGL ES 2.0 as well? If I have an image that is 320 x 480 in size and want to draw it full screen, is there any possible way to do this with OpenGL ES?
Thanks
NPOT textures are supported on PowerVR SGX hardware, but with restrictions: NPOT textures cannot use mipmaps, must be 2D (no cube maps or 3D textures), and must use GL_CLAMP_TO_EDGE for texture wrapping in both dimensions. This is supported by default in OpenGL ES 2.0, and under ES 1.1 via the extension GL_APPLE_texture_2D_limited_npot.
For ES 1.1, you can check at runtime to see if this extension is present with this code:
const char *extensions = (const char *) glGetString(GL_EXTENSIONS); // space-separated extension list
bool npot = strstr(extensions, "GL_APPLE_texture_2D_limited_npot") != NULL;
Since this is only present on the SGX and not the MBX, be aware that relying on NPOT texture support will limit you to the newer SGX devices. Of course, relying on ES 2.0 will do the same, so if that's your intended target, NPOT support is a moot point and you can go ahead with NPOT textures.
Here's an alternate solution that lets you keep using ES 1.1 and retain full device support. Put the 320x480 texture inside a 512x512 texture, fill the unused space with other background tiles, glyphs, or other textures that will be drawn at the same time (to avoid multiple glBindTexture calls), and then use one of my favourite ES 1.1 extensions, GL_OES_draw_texture, to quickly copy the 320x480 section onto the viewport:
GLint rect[4] = {0, 0, 480, 320}; // crop rect within the texture: x, y, width, height
glBindTexture(GL_TEXTURE_2D, texBackground);
glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_CROP_RECT_OES, rect);
glDrawTexiOES(0, 0, 0, 480, 320); // x, y, z (window depth), width, height in screen pixels
Sidebar: The OpenGL ES 2.0 core spec itself allows NPOT textures with exactly these restrictions (no mipmaps, CLAMP_TO_EDGE wrapping); lifting them requires the OES_texture_npot extension. In the ES 1.1 world, NPOT support doesn't exist at all, so there it's an addition.
Assuming you don't have too many full-screen textures, you could just use a 512x512 texture and only use 320x480 of it. It will definitely work.
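For example, if the image occupies the lower-left 320x480 of the 512x512 texture, the texture coordinates for a full-screen quad simply stop at the edge of the used region (a sketch, in triangle-strip order):
// Only 320x480 of the 512x512 texture holds image data.
const GLfloat uMax = 320.0f / 512.0f; // 0.625
const GLfloat vMax = 480.0f / 512.0f; // 0.9375
GLfloat texCoords[] = {
    0.0f, 0.0f,
    uMax, 0.0f,
    0.0f, vMax,
    uMax, vMax,
};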
I guess that depends on the hardware. I used to create the closest power-of-2 texture, i.e. if my texture is 320x480 then I create a 512x512 texture holding the original texture data. This ensures portability but consumes a bit more memory ;)
In Apple's docs (http://developer.apple.com/library/ios/#documentation/3DDrawing/Conceptual/OpenGLES_ProgrammingGuide/OpenGLESPlatforms/OpenGLESPlatforms.html)
it says that for "OpenGL ES 1.1 on the PowerVR SGX", "There are 8 texture units available."
It doesn't say how many units are available under OpenGL ES 2.0. Does that mean there is no limit?
Rather than asking and getting an answer that may or may not be correct in the future, your app should be checking programmatically at runtime using something like this:
GLint maxTextureUnits;
glGetIntegerv(GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS, &maxTextureUnits);
Note that there are also separate numbers for the number of allowed texture units in a vertex shader and a fragment shader. They would use the constants GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS and GL_MAX_TEXTURE_IMAGE_UNITS. The COMBINED number is the number available to both at the same time.
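For example, a sketch querying all three limits at startup:
GLint maxVertexUnits, maxFragmentUnits, maxCombinedUnits;
glGetIntegerv(GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS, &maxVertexUnits);     // may be 0 on hardware without vertex texture fetch
glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS, &maxFragmentUnits);          // fragment shader limit
glGetIntegerv(GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS, &maxCombinedUnits); // total across both stages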
There is a detailed listing of all the hardware across iPhones and iPads in Apple's iOS Device Compatibility Reference.
Based on this, you are safe using up to 8 texture units on any iOS device.
Actually, the answer is on the same page you linked in the question:
OpenGL ES 2.0 on the PowerVR SGX
Limits
...
You can use up to 8 textures in a fragment shader. You cannot use texture lookups in a vertex shader.
...
I want to implement image editing using OpenGL shaders. I have found some examples of how to implement off-screen rendering using OpenGL ES 1.
Do you know of any example of off-screen rendering using OpenGL ES 2 and shaders on the iPhone?
Thank you in advance
You need to use a framebuffer object (FBO), which is part of core OpenGL ES 2.
This works the same way as with OpenGL ES 1.1, except the functions lose their OES suffix (because FBO was an OES extension to ES 1, not part of the core).
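A minimal ES 2.0 render-to-texture setup, as a sketch (width and height are placeholders for your image size):
GLuint fbo, colorTexture;

// Texture that will receive the off-screen rendering.
glGenTextures(1, &colorTexture);
glBindTexture(GL_TEXTURE_2D, colorTexture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); // required for NPOT sizes
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

// Attach it to a framebuffer object; note there are no OES suffixes in ES 2.
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, colorTexture, 0);

if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
    // handle the error
}

// Draw your shader pass here, then rebind the default framebuffer
// and use colorTexture as input for display or the next pass.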
You might like this tutorial: http://programming4.us/multimedia/3288.aspx The code is pretty simple and should be easy to adapt to GLES2.
My render-to-texture iPhone code only works if I disable MSAA; otherwise all I get is a black texture. What could be the cause of the problem?
Here is my code:
glViewport(0, 0, target->_Width, target->_Height);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);
glBindTexture(GL_TEXTURE_2D, target->_Handle);
// render stuff here
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 0, 0, target->_Width, target->_Height, 0);
glBindTexture(GL_TEXTURE_2D, 0);
Apparently, when you are using MSAA for your main framebuffer, you have to use it for any other FBOs you want to render to as well. Since GL_TEXTURE_2D_MULTISAMPLE is not available on OpenGL ES 2, the solution I have found is quite simply to apply to your render-to-texture code the same modifications you need to go from regular rendering to MSAA rendering.
You need 3 additional buffers: a multi-sampled color renderbuffer, a multi-sampled depth renderbuffer, and a new FBO to attach them to. Bind the new FBO instead of the texture FBO before rendering. After rendering, resolve the new MSAA FBO into the texture FBO, the same way you do in your main rendering code using glResolveMultisampleFramebufferAPPLE().
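As a sketch, using 4x MSAA (the sample count, GL_RGBA8_OES, and the buffer names msaaFBO/textureFBO/width/height are illustrative assumptions):
GLuint msaaFBO, msaaColor, msaaDepth;

// Multisampled color renderbuffer (GL_APPLE_framebuffer_multisample).
glGenRenderbuffers(1, &msaaColor);
glBindRenderbuffer(GL_RENDERBUFFER, msaaColor);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER, 4, GL_RGBA8_OES, width, height);

// Multisampled depth renderbuffer.
glGenRenderbuffers(1, &msaaDepth);
glBindRenderbuffer(GL_RENDERBUFFER, msaaDepth);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER, 4, GL_DEPTH_COMPONENT16, width, height);

// New FBO that receives the multisampled rendering.
glGenFramebuffers(1, &msaaFBO);
glBindFramebuffer(GL_FRAMEBUFFER, msaaFBO);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, msaaColor);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, msaaDepth);

// ... render the scene into msaaFBO instead of the texture FBO ...

// Resolve the multisampled buffer into the single-sampled texture FBO.
glBindFramebuffer(GL_READ_FRAMEBUFFER_APPLE, msaaFBO);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER_APPLE, textureFBO);
glResolveMultisampleFramebufferAPPLE();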
Note that for some reason, rendering to a texture with MSAA enabled works without these modifications in the simulator. Maybe it uses GL_TEXTURE_2D_MULTISAMPLE automatically?
The cause of your problem is that you cannot render to a multisampled texture in OpenGL ES. Indeed, if I recall correctly, multisampled textures don't exist in OpenGL ES at all. Desktop OpenGL allows it, but it introduces a whole new texture target (GL_TEXTURE_2D_MULTISAMPLE) to do so.
A buffer that offers multisampling is not the same thing as a regular texture; that's why desktop OpenGL uses a special texture target, which has its own GLSL sampler type.
I am working on an iPhone game which will have many types of creeps, and each type of creep may have different colors, so I'm looking for the best way to do this; so far it seems to be palette swapping.
Is GL_EXT_paletted_texture available in OpenGL ES (it is deprecated in desktop OpenGL)? Since my game must support older devices (iPhone 3G), I can't use shaders, so I'm stuck with the fixed pipeline.
How should I do palette swapping with OpenGL ES on an iPhone?
OpenGL Color Index for iPhone's OpenGL ES 1.1?
It sounds like you can use glCompressedTexImage2D with GL_PALETTE4_RGB8_OES or GL_PALETTE8_RGBA8_OES. That would let you load the same texel data with different palette data.
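As a sketch of how that could work for an 8-bit palette (creepPalette, creepIndices, width, and height are hypothetical names; swapping the palette means rewriting only the first 1024 bytes):
// GL_PALETTE8_RGBA8_OES layout: 256 RGBA8 palette entries (1024 bytes)
// followed by one byte of palette index per texel.
GLsizei paletteBytes = 256 * 4;
GLsizei imageSize = paletteBytes + width * height;
GLubyte *data = (GLubyte *) malloc(imageSize);
memcpy(data, creepPalette, paletteBytes);                  // per-creep palette
memcpy(data + paletteBytes, creepIndices, width * height); // shared index data
glCompressedTexImage2D(GL_TEXTURE_2D, 0, GL_PALETTE8_RGBA8_OES,
                       width, height, 0, imageSize, data);
free(data);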
Or you can use OpenGL ES 1.1 texture environments: combine the texture with a constant color using an appropriate combine environment (see the sketch below the book reference).
iPhone 3D Programming - Chapter 8. Advanced Lighting and Texturing
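As a sketch of the combiner approach (this gives a per-creep tint rather than a true palette swap; the tint value is just an example):
// Modulate the texture color with a per-creep constant color.
GLfloat tint[4] = {1.0f, 0.3f, 0.3f, 1.0f}; // reddish variant
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
glTexEnvfv(GL_TEXTURE_ENV, GL_TEXTURE_ENV_COLOR, tint);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_MODULATE);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_RGB, GL_TEXTURE);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC1_RGB, GL_CONSTANT);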
I want to make a metallic 3D object that appears to be reflective. I want to accomplish this using an environment shader that uses either a sphere or cube map, where I can assign an image or texture as the "reflection" source.
Does OpenGL ES on the iPhone support this in any version?
OpenGL ES 2.0 provides shader support, but it isn't available on many mobile devices that are on the market today, so it would be important for you to code both ES 1.1 and ES 2.0 versions of the graphics.
Apple Dev Center has tons of information on the transition:
The fixed-function pipeline of OpenGL ES 1.1 provides good baseline behavior for a 3D graphics pipeline, from transforming and lighting vertices to blending the final pixels with the framebuffer. If you choose to implement an OpenGL ES 2.0 application, you will need to duplicate this functionality. On the other hand, OpenGL ES 2.0 is more flexible than OpenGL ES 1.1. Custom vertex and fragment operations that would be difficult or impossible to implement using OpenGL ES 1.1 can be trivially implemented with an OpenGL ES 2.0 shader. Implementing a custom operation in an OpenGL ES 1.1 application often requires multiple rendering passes and complex changes to OpenGL ES state that obscure the intent of the code. As your algorithms grow in complexity, shaders convey those operations more clearly and concisely and with better performance.
http://developer.apple.com/iphone/library/documentation/3DDrawing/Conceptual/OpenGLES_ProgrammingGuide/DeterminingOpenGLESCapabilities/DeterminingOpenGLESCapabilities.html#//apple_ref/doc/uid/TP40008793-CH102-SW1
In the old days, the "metallic" look was achieved using a technique called "environment mapping" or "reflection mapping".
Since no programmable shaders are available in OpenGL ES 1.1, simple reflection mapping can be done in software: just transform the vertex normals according to the reflection source/camera and derive texture UV coordinates from the transformed normal vectors. The iPhone has the horsepower to do this easily, at least with decent vertex counts.
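A minimal CPU-side sketch of that idea, assuming the normals have already been transformed into view space (viewSpaceNormals, texCoords, and vertexCount are illustrative names; this is the classic sphere-map approximation):
// Derive sphere-map UVs from each view-space normal.
for (int i = 0; i < vertexCount; i++) {
    float nx = viewSpaceNormals[i * 3 + 0];
    float ny = viewSpaceNormals[i * 3 + 1];
    texCoords[i * 2 + 0] = nx * 0.5f + 0.5f;
    texCoords[i * 2 + 1] = ny * 0.5f + 0.5f;
}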
OpenGL ES supports most of the features of OpenGL (and some extra features for mobile devices). If I recall correctly, the iPhone 3GS supports fragment shaders, while the older iPhone 3G just supports the fixed pipeline.