How to mirror image data with OpenGL ES on iOS - iphone

I'm developing a tool on a jailbroken device.
I get a frame buffer via IOSurfaceGetBaseAddress, which gives me mirrored data.
I would like to mirror the image back. Reversing it pixel by pixel is too slow, so I want to mirror it with OpenGL ES. Does anyone have sample code for this kind of thing?

You can render the back side of the texture.
The pixels then come out mirrored horizontally.
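A minimal sketch of that idea (OpenGL ES 1.1, fixed-function; it assumes the IOSurface data is already available as a GL texture, and the names here are made up for illustration): draw a full-screen quad whose S texture coordinates run from 1 to 0 instead of 0 to 1.

// Full-screen quad in normalized device coordinates.
static const GLfloat quadVertices[] = {
    -1.0f, -1.0f,
     1.0f, -1.0f,
    -1.0f,  1.0f,
     1.0f,  1.0f,
};
// S coordinates run 1 -> 0 instead of 0 -> 1: this is what mirrors the image.
static const GLfloat mirroredTexCoords[] = {
    1.0f, 0.0f,
    0.0f, 0.0f,
    1.0f, 1.0f,
    0.0f, 1.0f,
};

void drawMirroredQuad(GLuint texture)
{
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, texture);
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, quadVertices);
    glTexCoordPointer(2, GL_FLOAT, 0, mirroredTexCoords);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    glDisableClientState(GL_VERTEX_ARRAY);
    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
}

If you need the mirrored result back as a CPU buffer rather than on screen, you can read it out of the framebuffer with glReadPixels afterwards.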

Related

Pixelation issues on iPhone Air game

I am working on a cross-platform mobile game for Android and iOS devices. I am using Adobe Flash with AIR and AS3 to code the game. I am drawing my character, obstacles, and backgrounds in Adobe Illustrator. The canvas in Flash is set to 960x640. The character was intended to be 1/3 of the screen height, so around 213 pixels high. I designed the character in Adobe Illustrator to be somewhere around 900 pixels high. When I imported the character into Flash I animated him, instantiated him using var player:Player = new Player(), and scaled him down to size using the scaleX and scaleY properties. I tested it out on the desktop and an Android phone and it looked wonderful. However, when I tested it out on an iPhone, the player was unacceptably pixelated around the edges. I figured the fact that I drew the animation much larger than the intended height must be the problem, so I redrew the player at exactly 213 pixels high and retested on the iPhone without any improvement in the quality of the animation. I also tried converting the MovieClip to a Bitmap as explained here, but that also had no effect on the quality of the animation.
At this point, I am at a loss. Does anyone have any suggestions on how to avoid this pixelation issue that I am experiencing when going from Adobe Illustrator to Flash to the iPhone?
For a more optimised iPhone route you might want to consider creating your animation as a set of bitmap graphics, i.e. create them as PNG files using Photoshop at the size you want them to be displayed at.
By doing this you'll save CPU time, since Flash won't have to create smoothed bitmaps on the fly.
Try to set smoothing to true for your bitmap.
yourbitmap.smoothing = true;
I have some suggestions for your problem.
The character should not have dark outlines; use gradients for better effects.
Choose a color combination for your character that works with the background and the other elements you use.

Is there a way to render pixels directly on iPhone?

I want to port a game I've made which renders the screen itself at 50 fps (it doesn't use OpenGL).
What is the best way to port this to the iPhone?
I was reading about Framebuffer Objects. Is this a good approach for rendering a buffer of pixels to the screen at high speed?
The fastest way to get pixels on the screen is via OpenGL.
Need more info about how your game currently renders to the screen, but I don't see how FBOs will help, as they're usually used for getting a copy of the render buffer, e.g. for creating a screen recording, or for compositing custom textures on the fly.
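For what it's worth, the usual OpenGL route is to copy the CPU-side frame into a texture once per frame with glTexSubImage2D and draw it on a full-screen quad. A rough sketch only (the 512x512 power-of-two texture and names such as screenTexture and gamePixels are assumptions, not anything from the question):

#include <OpenGLES/ES1/gl.h>

#define GAME_W 320
#define GAME_H 480

// screenTexture was created once with glTexImage2D at 512x512 (power of two);
// gamePixels is the game's own 320x480 RGBA frame in CPU memory.
void uploadAndDrawFrame(GLuint screenTexture, const void *gamePixels)
{
    // Replace the used sub-region of the texture with this frame's pixels.
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, screenTexture);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, GAME_W, GAME_H,
                    GL_RGBA, GL_UNSIGNED_BYTE, gamePixels);

    // Full-screen quad; texture coordinates only cover the 320x480 region.
    static const GLfloat verts[] = { -1, -1,  1, -1,  -1, 1,  1, 1 };
    static const GLfloat coords[] = {
        0.0f,            0.0f,
        320.0f / 512.0f, 0.0f,
        0.0f,            480.0f / 512.0f,
        320.0f / 512.0f, 480.0f / 512.0f,
    };
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, verts);
    glTexCoordPointer(2, GL_FLOAT, 0, coords);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    glDisableClientState(GL_VERTEX_ARRAY);
    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
}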
If I ever need to create an app where I have to access the pixels directly and don't have direct access to the hardware, I use SDL, as it just requires you to create a surface, and from there you can manipulate the pixels directly. As far as I'm aware you can use SDL on the iPhone, and maybe even accelerate it using OpenGL too.
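As a sketch of what that surface approach looks like (SDL 1.2-style API shown as an assumption; the exact setup on an iPhone port will differ):

#include <SDL.h>

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Surface *screen = SDL_SetVideoMode(320, 480, 32, SDL_SWSURFACE);

    // Write pixels straight into the surface's memory.
    if (SDL_MUSTLOCK(screen)) SDL_LockSurface(screen);
    Uint32 *pixels = (Uint32 *)screen->pixels;
    for (int y = 0; y < screen->h; ++y)
        for (int x = 0; x < screen->w; ++x)
            pixels[y * (screen->pitch / 4) + x] =
                SDL_MapRGB(screen->format, x & 0xFF, y & 0xFF, 0x80);
    if (SDL_MUSTLOCK(screen)) SDL_UnlockSurface(screen);

    SDL_Flip(screen);   // present the frame
    SDL_Delay(2000);
    SDL_Quit();
    return 0;
}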

antialiasing iPhone OpenGLES

I need antialiasing on the iPhone 3G (OpenGL ES 1.1), NOT the iPhone 3GS with OpenGL ES 2.0.
I've drawn a 3D model and the pixels on the edges of the model look jagged, like teeth.
I've tried setting various filters for the texture, but those filters ONLY make the INSIDE of the texture look better.
How can I get good antialiasing?
Maybe I should use some kind of smoothing when drawing the triangles? If so, how is that possible in OpenGL ES 1.1?
Thanks.
As of iOS 4.0, full-screen anti-aliasing is directly supported via an Apple extension to OpenGL. The basic concept is similar to epatel's suggestion: render the scene onto a larger framebuffer, then copy that down to a screen-sized framebuffer, then copy that buffer to the screen. The difference is, instead of creating a texture and rendering it onto a quad, the copy/sample operation is performed by a single function call (specifically, glResolveMultisampleFramebufferAPPLE()).
For details on how to set up the buffers and modify your drawing code, you can read a tutorial on the Gando Games blog which is written for OpenGL ES 1.1; there is also a note on Apple's Developer Forums explaining the same thing.
Thanks to Bersaelor for pointing this out in another SO question.
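For orientation, here is a rough sketch of what that setup looks like (OpenGL ES 1.1 with the OES framebuffer object and APPLE multisample extensions; the buffer names, the sample count of 4, and the omitted depth attachment are all illustrative, not taken from the tutorial):

GLuint msaaFramebuffer, msaaColorBuffer;

void setupMultisampleBuffers(GLint width, GLint height)
{
    glGenFramebuffersOES(1, &msaaFramebuffer);
    glGenRenderbuffersOES(1, &msaaColorBuffer);

    glBindFramebufferOES(GL_FRAMEBUFFER_OES, msaaFramebuffer);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, msaaColorBuffer);
    // 4x multisampled color storage via the APPLE extension.
    glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER_OES, 4,
                                          GL_RGBA8_OES, width, height);
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES,
                                 GL_RENDERBUFFER_OES, msaaColorBuffer);
    // A multisampled depth renderbuffer would be attached the same way.
}

void resolveToScreen(GLuint defaultFramebuffer)
{
    // Downsample the multisampled buffer into the on-screen framebuffer;
    // the color renderbuffer is then presented as usual from Objective-C.
    glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, msaaFramebuffer);
    glBindFramebufferOES(GL_DRAW_FRAMEBUFFER_APPLE, defaultFramebuffer);
    glResolveMultisampleFramebufferAPPLE();
}

Each frame you render into msaaFramebuffer instead of the default framebuffer and call the resolve step just before presenting.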
You can render into a larger FBO and then use that as a texture on a square.
Have a look at this article for an explanation.
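A sketch of that supersampling approach (OpenGL ES 1.1 with OES_framebuffer_object; the 1024x1024 power-of-two texture size and the 2x factor are just example choices):

GLuint supersampleFBO, supersampleTex;
const GLint ssWidth = 640, ssHeight = 960;   // 2x a 320x480 screen

void setupSupersampleFBO(void)
{
    glGenTextures(1, &supersampleTex);
    glBindTexture(GL_TEXTURE_2D, supersampleTex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    // ES 1.1 needs power-of-two textures, so allocate 1024x1024 and use
    // only a 640x960 corner of it.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1024, 1024, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    glGenFramebuffersOES(1, &supersampleFBO);
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, supersampleFBO);
    glFramebufferTexture2DOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES,
                              GL_TEXTURE_2D, supersampleTex, 0);
}

Each frame: bind supersampleFBO, set glViewport(0, 0, ssWidth, ssHeight), draw the scene, then bind the on-screen framebuffer and draw a screen-sized quad textured with supersampleTex (sampling only the used 640x960 region). The linear filtering during the downscale is what smooths the edges.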
Check out the EGL_SAMPLE_BUFFERS and EGL_SAMPLES parameters to eglChooseConfig(), as well as glEnable(GL_MULTISAMPLE).
EDIT: Hrm, apparently you're out of luck, at least as far as standardized approaches go. As mentioned in that thread you can render to a large off-screen texture and scale to a smaller on-screen quad or jitter the view matrix several times.
We found another way to achieve this. If you edit your textures and add, for example, a 2-pixel border of transparent pixels, the colored pixels in the texture are blended with the transparent pixels when necessary, giving a basic anti-aliasing effect. You can read the full article here on our blog.
The advantage of this approach is that you are not rendering a bigger image, or copying a buffer, or, even worse, making a texture from a buffer, so there is no impact on performance.

iPhone OpenGL ES 2d background texture

I have a 1024 x 1024 image that I use as the background texture in my game.
I'm wondering if there is a proper way to handle drawing a large background texture.
Here is how I am doing it currently:
GLfloat texCoord[] = { 0, 0,  1, 0,  0, 1,  1, 1 };
GLfloat vertice[]  = { 0, 0,  0, height,  width, 0,  width, height };

glTexCoordPointer(2, GL_FLOAT, 0, texCoord);
glVertexPointer(2, GL_FLOAT, 0, vertice);
glBindTexture(GL_TEXTURE_2D, backgroundTexture);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glEnableClientState(GL_VERTEX_ARRAY);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
glDisableClientState(GL_TEXTURE_COORD_ARRAY);
glDisableClientState(GL_VERTEX_ARRAY);
That's fine...
I don't know if OpenGL ES on the iPhone supports the glDrawTexOES extension, but if it does you may save some lines of code. It won't make drawing any faster though.
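If the extension is available, the draw would look roughly like this (a sketch; the crop rectangle covers the whole 1024x1024 texture, the texture is assumed to be bound already, and width/height are the screen size from the question):

// Crop rectangle in texels: x, y, width, height.
GLint cropRect[4] = { 0, 0, 1024, 1024 };
glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_CROP_RECT_OES, cropRect);
// Draw directly to the screen at pixel coordinates: x, y, z, width, height.
glDrawTexiOES(0, 0, 0, width, height);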
Also some additional hints:
try to make the texture exactly as large as the screen. There is no need to store the image at 1024x1024 if the real resolution is more like 480x320. If you zoom or pan the image, that's another matter of course.
You may save quite a bit of memory if you don't upload mipmaps for the backdrop.

iPhone camera images as OpenGL ES textures

Is it possible to use an image captured with the iPhone's camera as a texture that is then manipulated in OpenGL ES (flag wave effect, etc.)? The main problem is that the iPhone screen is 320x480 (with no status bar), so the image won't have power-of-two dimensions. Is the main option copying it into a 512x512 texture and adjusting the vertices?
Yes, that's the way to do it.
Just use a larger texture. It's a waste of memory but unfortunately there is no way around this problem.
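A minimal sketch of that (OpenGL ES 1.1; cameraPixels stands in for an RGBA buffer you would extract from the captured UIImage, which is an assumption, not part of the question):

GLuint cameraTexture;
glGenTextures(1, &cameraTexture);
glBindTexture(GL_TEXTURE_2D, cameraTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// Allocate the 512x512 power-of-two texture, then copy the 320x480 image
// into its corner.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 512, 512, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 320, 480,
                GL_RGBA, GL_UNSIGNED_BYTE, cameraPixels);

// When texturing the (flag) mesh, only sample the region that actually
// contains the image:
GLfloat maxS = 320.0f / 512.0f;
GLfloat maxT = 480.0f / 512.0f;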
An alternative would be dividing the picture into squares with a length and height of 32 pixels (aka tiling), resulting in 15x8 tiles. Displaying it would however involve many texture switches while drawing, which might become a bottleneck. On the other hand, you would save a lot of memory using a tiled approach.