How to create a distorted screen - iPhone

I want to show a distorted image as an error page for my application. If possible, this could be a screenshot of the home screen with some graphical distortion applied. Is this possible?
Thank you.

As Daniel A. White's comment mentions, this will probably cause your application to be rejected from the App Store, but it can be accomplished in many ways. I think this technique would be acceptable if your own interface appeared broken, but not acceptable if you made any iOS-supplied looks appear broken.
You could just use your favorite image editor (e.g. Photoshop) to distort a screenshot, and display it by putting it in a separate UIView. The image would be static, though; it couldn't react to the contents of your program's interface.
If your interface is drawn with OpenGL ES 2.0, you could draw your regular interface to a texture, then use that texture as input to another GLSL program that applies the distortion.
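For concreteness, here is a minimal sketch of that render-to-texture approach in C with OpenGL ES 2.0. Everything here is illustrative: the sine-wave shader, the drawScene()/drawFullscreenQuad() helpers, and the screenFBO parameter are hypothetical stand-ins for your own code, and shader compilation plus the matching vertex shader are omitted.

```c
#include <OpenGLES/ES2/gl.h>

/* Hypothetical fragment shader applying a simple sine-wave distortion.
   vTexCoord is assumed to come from a pass-through vertex shader. */
static const char *distortFrag =
    "precision mediump float;\n"
    "uniform sampler2D uScene;\n"
    "varying vec2 vTexCoord;\n"
    "void main() {\n"
    "    vec2 uv = vTexCoord;\n"
    "    uv.x += 0.02 * sin(uv.y * 40.0);  /* horizontal wobble */\n"
    "    gl_FragColor = texture2D(uScene, uv);\n"
    "}\n";

extern void drawScene(void);           /* your regular interface */
extern void drawFullscreenQuad(void);  /* quad with vTexCoord over [0,1]^2 */

void renderDistorted(GLuint screenFBO, GLuint distortProgram,
                     int width, int height)
{
    GLuint fbo, sceneTex;

    /* 1. Render the normal interface into a texture-backed framebuffer. */
    glGenTextures(1, &sceneTex);
    glBindTexture(GL_TEXTURE_2D, sceneTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    /* CLAMP_TO_EDGE is required for non-power-of-two sizes in ES 2.0. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, sceneTex, 0);
    drawScene();

    /* 2. Switch to the on-screen framebuffer (on iOS, the one attached to
          your EAGL layer) and redraw through the distortion program. */
    glBindFramebuffer(GL_FRAMEBUFFER, screenFBO);
    glUseProgram(distortProgram);
    glBindTexture(GL_TEXTURE_2D, sceneTex);
    drawFullscreenQuad();

    /* In real code, cache fbo/sceneTex instead of recreating them. */
    glDeleteFramebuffers(1, &fbo);
    glDeleteTextures(1, &sceneTex);
}
```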

Related

Usage of "Don't Clear" in "Clear Flags" property of Camera

In Unity's Camera component, there is a Clear Flags property that lets you choose from four options: Skybox, Solid Color, Depth Only and Don't Clear.
As the documentation says:
Don’t clear
This mode does not clear either the color or the depth buffer. The result is that each frame is drawn over the next, resulting in a smear-looking effect. This isn't typically used in games, and would more likely be used with a custom shader.
Note that on some GPUs (mostly mobile GPUs), not clearing the screen might result in the contents of it being undefined in the next frame. On some systems, the screen may contain the previous frame image, a solid black screen, or random colored pixels.
"This isn't typically used in games and would more likely be used with a custom shader"
So my question is :
How to use it in a custom shader and What effects can be achieved by using it?
Has anyone ever used it or has a good explanation about the basic concept.
Thanks
An idea would be the enemy-encounter effects in the Final Fantasy games. Look at the top edge of this GIF to see the smearing of previous frames; it is probably combined with blur/rotation.
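Stripped of the Unity specifics, the underlying technique is just: skip the clear, then blend new drawing over whatever the last frame left behind. A minimal conceptual sketch in C with OpenGL ES 1.1 (the two draw helpers are hypothetical stand-ins):

```c
#include <OpenGLES/ES1/gl.h>

extern void drawSprites(void);         /* your moving objects */
extern void drawFadeQuad(float alpha); /* fullscreen quad in the clear color */

void renderFrameWithTrails(void)
{
    /* Deliberately NO glClear(GL_COLOR_BUFFER_BIT): the previous frame
       stays in the color buffer. As the docs quoted above warn, on some
       GPUs those preserved contents are undefined. */

    /* Fade the old frame slightly toward the background color so trails
       decay over time instead of accumulating forever. */
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    drawFadeQuad(0.08f);  /* small alpha = long trails, large = short */

    drawSprites();        /* drawn over the remnants of earlier frames */
}
```

In Unity terms, that fade quad is the role a custom shader typically plays when Clear Flags is set to Don't Clear.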
This question is a bit old; however, I had this problem and solved it.
I've made a screen image effect that reproduces it; you can see it here:
https://github.com/falconmick/ClearFlagsMobile
Hope this helps!

Coding a Photoshopped GUI for iOS devices

Well... I have searched for a while on the topic of coding GUI elements that a designer delivers in Photoshop format, but I have a really hard time putting it together. Just as an example: if I wanted to make an app with only a simple LCD display showing a countdown timer, how would I start? Don't get me wrong, I know how to write the code behind the scenes to make the timer count, etc.
But what about setting up a nice-looking GUI with a glossy display effect? What is the "correct way" to implement such a GUI? Taking a Photoshop file showing a glossy display and setting a UILabel on top of it, or coding the gloss effect programmatically?
This is just one example... I can't find good resources for getting started on such a topic. I would be really glad if you could give me a helping hand.
In the typical app development cycle, you would have the graphics people delivering graphics to the programming people, in the form of PNG files.
However, it is entirely possible to render all kinds of things on the fly on the device. The blue shade on the tab bar icons in any app using UITabBarController is a clear example: the programmer puts in a PNG with just the alpha channel, and the system renders the blue shading.
Using Quartz Core (look for CGContext in the documentation) you can draw lines and text, and apply all kinds of transformations, gradients, clipping paths, etc. With it you can create your own styled subclasses of UIView and the like.
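To illustrate the Quartz route, here is a hedged sketch of a glossy background drawn with plain Core Graphics calls. The function name and the gradient stops are invented for the example; you would call it from a UIView's drawRect: with the current context.

```c
#include <CoreGraphics/CoreGraphics.h>

void DrawGlossyBackground(CGContextRef ctx, CGRect rect)
{
    /* Dark "LCD" background fill. */
    CGContextSetRGBFillColor(ctx, 0.05, 0.10, 0.05, 1.0);
    CGContextFillRect(ctx, rect);

    /* A white-to-transparent gradient over the top half gives the gloss. */
    CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
    CGFloat components[8] = { 1, 1, 1, 0.45,   /* top: translucent white */
                              1, 1, 1, 0.05 }; /* bottom: nearly clear   */
    CGFloat locations[2] = { 0.0, 1.0 };
    CGGradientRef gloss =
        CGGradientCreateWithColorComponents(space, components, locations, 2);

    CGRect topHalf = CGRectMake(rect.origin.x, rect.origin.y,
                                rect.size.width, rect.size.height / 2);
    CGContextSaveGState(ctx);
    CGContextClipToRect(ctx, topHalf);
    CGContextDrawLinearGradient(ctx, gloss,
                                CGPointMake(0, CGRectGetMinY(topHalf)),
                                CGPointMake(0, CGRectGetMaxY(topHalf)), 0);
    CGContextRestoreGState(ctx);

    CGGradientRelease(gloss);
    CGColorSpaceRelease(space);
}
```

A UILabel for the timer digits can then simply be placed on top of the view that draws this background.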
The PNG approach is generally the easier way.

Is there a way to render pixels directly on iPhone?

I want to port a game I've made that renders the screen itself at 50 fps (it doesn't use OpenGL).
What is the best way to port this to the iPhone?
I was reading about Framebuffer Objects. Is this a good approach to render a buffer of pixels to the screen at high speeds?
The fastest way to get pixels on the screen is via OpenGL.
More info is needed about how your game currently renders to the screen, but I don't see how FBOs will help, as they're usually used for getting a copy of the render buffer, e.g. for creating a screen recording or compositing custom textures on the fly.
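To sketch what the OpenGL route usually looks like for a software-rendered game: keep the existing renderer, upload its pixel buffer into a texture each frame, and draw one textured quad. Every name below is a hypothetical stand-in, and screenTex is assumed to have been created earlier as a 512x512 texture (power-of-two, as OpenGL ES 1.1 requires).

```c
#include <OpenGLES/ES1/gl.h>

#define FB_W 320
#define FB_H 480

extern void gameRenderFrame(unsigned *pixels, int w, int h); /* existing code */
extern void drawFullscreenTexturedQuad(void);

static unsigned framebuffer[FB_W * FB_H];  /* tightly packed RGBA pixels */

void presentSoftwareFrame(GLuint screenTex)
{
    /* Let the game render into its own buffer, exactly as it does today. */
    gameRenderFrame(framebuffer, FB_W, FB_H);

    /* Re-upload only the used 320x480 region of the 512x512 texture. */
    glBindTexture(GL_TEXTURE_2D, screenTex);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, FB_W, FB_H,
                    GL_RGBA, GL_UNSIGNED_BYTE, framebuffer);

    drawFullscreenTexturedQuad();
}
```

At 50 fps the per-frame texture upload is the main cost, so keeping the game's buffer format identical to the texture format (no per-pixel conversion) matters.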
If I ever need to create an app where I have to access the pixels directly and don't have direct access to the hardware, I use SDL, as it just requires you to create a surface, and from there you can manipulate the pixels directly. As far as I'm aware you can use SDL on the iPhone, and maybe even accelerate it using OpenGL too.
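For reference, a minimal sketch of that surface approach with the SDL 1.2-era API (which is what the early iPhone ports wrapped); error handling is omitted and the gradient pattern is just filler:

```c
#include <SDL/SDL.h>

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Surface *screen = SDL_SetVideoMode(320, 480, 32, SDL_SWSURFACE);

    if (SDL_MUSTLOCK(screen)) SDL_LockSurface(screen);

    /* Write pixels directly into the surface's memory. */
    Uint32 *pixels = (Uint32 *)screen->pixels;
    int pitch = screen->pitch / 4;  /* pitch is given in bytes */
    for (int y = 0; y < screen->h; y++)
        for (int x = 0; x < screen->w; x++)
            pixels[y * pitch + x] =
                SDL_MapRGB(screen->format, x & 0xFF, y & 0xFF, 0x80);

    if (SDL_MUSTLOCK(screen)) SDL_UnlockSurface(screen);
    SDL_Flip(screen);  /* present the frame */

    SDL_Delay(2000);
    SDL_Quit();
    return 0;
}
```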

iPhone UIImage overlap render bug

I've come across a strange render bug on iPhone OS 3.0...
I have two images. One is a non-transparent PNG that is predominantly black with a white gradient fading upward.
The second is a transparent PNG with translucent clouds.
When I overlay the two using UIImageView, the intersection of the clouds and the white gradient triggers a render bug: an odd-looking graphical glitch that removes all opacity from the image on top (in this case the clouds) and causes the glitched portion of the image to render on top of all layers in the current view (including ones it is technically underneath).
It only occurs at the intersection of the two portions of the images. So typically only a very small block is experiencing the error while the rest of the images render normally.
Has anyone seen this, and does anyone have a fix? I want to check before I move on to Core Animation, which will hopefully address the problem (since I imagine that CA or even OpenGL is better equipped to handle overlapping alpha channels).
Screenshot found here:
http://www.jasconi.us/glitch.jpg
You can see the intersect of the two images at the lower right.
From your description, this seems to be a bug in Apple's code. I would report it to Apple and wait for a fix.
In the meantime, you can try to implement the same functionality in Core Animation or OpenGL in the hope that the bug is in the higher-level UIImageView, but since the UIImageView itself uses Core Animation, it's possible that this bug is simply unavoidable until it's fixed.
I assume you're displaying them using UIImageView? If so, have you set opaque to NO on the transparent view?

Performance and background images for OpenGL ES/iPhone

I'm developing a 2D game for the iPhone using OpenGL ES and I'd like to use a 320x480 bitmapped image as a persistent background.
My first thought was to create a 320x480 quad and then map a texture onto it that represents the background. So... I created a 512x512 texture with a 320x480 image on it. Then I mapped that to the 320x480 quad.
I draw this background every frame and then draw animated sprites on top of it. This works fine except that the drawing of all of these objects (background + sprites) is too slow.
I did some testing and discovered that my slowdown is in the pixel pipeline. Not surprisingly, the large background image is the main culprit. To prove this, I removed the background draw and everything else rendered very fast.
I am looking for advice on how to keep my background and also improve performance.
Here's some more info:
1) I am currently testing on the Simulator (still waiting on Apple for the license)
2) The background is a PVR texture squeezed down to 128k
3) I had hoped that there might be a way to cache this background into a color buffer but haven't had any luck with that. That may be due to my inexperience with OpenGL ES, or it just might be a stupid idea that won't work :)
4) I realize that the entire background does not always have to refresh, just the parts that have been drawn over by the moving sprites. I started to look into techniques for refreshing (as necessary) parts of the background either as separate textures or with a scissor box, but this seems less than elegant.
Any tips/advice would be greatly appreciated...
Thank you.
Do not do performance testing on the simulator. Ever!
The differences from the real hardware are huge, in both directions.
If you draw the background every frame (a sketch of this loop follows below):
- Do not clear the framebuffer; the background will overdraw the whole thing anyway.
- Do you really need a background texture? What about using a color gradient via vertex colors instead?
- Try using the 2-bit (PVRTC 2bpp) mode for the texture.
- Turn off all render steps that you do not need for the background, e.g. lighting, blending, and depth testing.
If you could post some of your drawing code it would be a lot easier to help you.
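Putting several of those tips together, a hedged sketch of the per-frame loop in OpenGL ES 1.1 C (the two draw helpers stand in for your own code):

```c
#include <OpenGLES/ES1/gl.h>

extern void drawBackgroundQuad(void);  /* fullscreen textured quad */
extern void drawSprites(void);         /* animated foreground sprites */

void renderFrame(void)
{
    /* No glClear: the opaque background overdraws the whole buffer anyway. */

    /* Background: disable every state it does not need. */
    glDisable(GL_BLEND);       /* the background is opaque */
    glDisable(GL_DEPTH_TEST);  /* 2D scene drawn in painter's order */
    glDisable(GL_LIGHTING);
    drawBackgroundQuad();

    /* Sprites: re-enable only what they actually need. */
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    drawSprites();
}
```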
If you're making a 2D game, is there any reason you aren't using an existing library? Specifically, cocos2d for iPhone may be worth your time. I can't answer your question about how to fix the issue when doing it all yourself, but I can say that I've done exactly what you're talking about (one full-screen background with sprites on top) with cocos2d and it works great. (Assuming 60 fps is fast enough for you.) You may have your reasons for doing it yourself, but if you can, I would highly suggest at least doing a quick prototype with cocos2d and seeing if that doesn't help you along. (Details and source for the iPhone version are here: http://code.google.com/p/cocos2d-iphone/)
Thanks to everyone who provided info on this. All of the advice helped out in one way or another.
However, I wanted to make it clear that the main issue here turned out to be the behavior of the simulator itself (as implied by Andreas in his response). Once I was able to get the application onto the device, it performed much, much better. I mention this because, prior to developing my game, I had seen a lot of posts indicating that the device was much slower than the simulator. This might be true in some instances (e.g. general application logic), but in my experience animation (particularly 3D transformations) is much faster on the device.
I don't have much experience with OpenGL ES, but this problem occurs generally.
Your idea about the 'color buffer' is good intuition: essentially you want to store your background in a framebuffer and load it directly onto your rendering buffer before drawing the foreground.
In OpenGL this is fairly straightforward with Framebuffer Objects (FBOs). They are available on the iPhone too: OpenGL ES 1.1 exposes them through the GL_OES_framebuffer_object extension (and they are core in ES 2.0), so that might give you somewhere to start looking.
You may want to try using VBOs (Vertex Buffer Objects) and see if that speeds things up. A tutorial is here.
In addition, I just saw that since OpenGL ES v1.1 there is a function called glDrawTex (Draw Texture) that is designed for
fast rendering of background paintings, bitmapped font glyphs, and 2D framing elements in games
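Going by the extension's documented interface, a usage sketch in C looks like the following; backgroundTex is assumed to be the poster's 512x512 texture with the 320x480 image in its corner:

```c
#include <OpenGLES/ES1/gl.h>
#include <OpenGLES/ES1/glext.h>  /* GL_OES_draw_texture entry points */

void drawBackground(GLuint backgroundTex)
{
    glBindTexture(GL_TEXTURE_2D, backgroundTex);

    /* The crop rectangle {u, v, w, h} (in texels) selects the 320x480
       region of the 512x512 texture; flip the height if the image comes
       out upside down. */
    GLint crop[4] = { 0, 0, 320, 480 };
    glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_CROP_RECT_OES, crop);

    /* Blit straight to window coordinates: x, y, z, width, height. */
    glDrawTexiOES(0, 0, 0, 320, 480);
}
```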
You could use frame buffer objects similar to the GLPaint example from Apple.
Use a texture atlas to minimize the number of draw calls you make. You can use glTexCoordPointer to set texture coordinates that map each image to its correct position (see the sketch after these tips). Remember to set your vertex buffer too. Ideally one draw call will render your entire 2D scene.
Avoid enabling/disabling states where possible.
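As a hedged sketch of the atlas idea in OpenGL ES 1.1 C (the coordinates and the two-sprite layout are invented for the example):

```c
#include <OpenGLES/ES1/gl.h>

void drawAtlasBatch(GLuint atlasTex)
{
    /* Two quads, each as two triangles (6 vertices of x,y). */
    const GLfloat verts[] = {
        /* sprite A */   0,  0,   32,  0,    0, 32,
                        32,  0,   32, 32,    0, 32,
        /* sprite B */  64, 64,   96, 64,   64, 96,
                        96, 64,   96, 96,   64, 96,
    };
    /* Texcoords pick each sprite's sub-rectangle out of the atlas. */
    const GLfloat uvs[] = {
        /* A: left half  */ 0.0f, 0.0f,  0.5f, 0.0f,  0.0f, 1.0f,
                            0.5f, 0.0f,  0.5f, 1.0f,  0.0f, 1.0f,
        /* B: right half */ 0.5f, 0.0f,  1.0f, 0.0f,  0.5f, 1.0f,
                            1.0f, 0.0f,  1.0f, 1.0f,  0.5f, 1.0f,
    };

    glBindTexture(GL_TEXTURE_2D, atlasTex);
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, verts);
    glTexCoordPointer(2, GL_FLOAT, 0, uvs);

    /* One draw call submits every sprite that shares the atlas. */
    glDrawArrays(GL_TRIANGLES, 0, 12);
}
```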