iOS frame-by-frame animation with Quartz

I'm working on an iPhone game that involves only two dimensional, translation-based animation of just one object. This object is subclassed from UIView and drawn with Quartz-2D. The translation is currently put into effect by an NSTimer that ticks each frame and tells the UIView to change its location.
However, there is some fairly complex math that goes into determining where the UIView should move during the next frame. Testing the game on the iOS simulator works fine, but when testing on an actual iPhone it definitely seems to be skipping frames.
My question is this: is translating the view frame by frame like this simply a bad approach? I know OpenGL is more typically used for games, but it seems a shame to set up OpenGL for such a simple animation. Nonetheless, is it worth the hassle?

It's hard to say without knowing what kind of complex math is going on to calculate the translations. Using OpenGL for this only makes sense if the GPU is really the bottleneck. I would suspect that this is not the case, but you have to test which parts are causing the skipped frames.
Generally, UIView and CALayer are implemented on top of OpenGL, so animating the translation of a UIView already makes use of the GPU.
As an aside, using CADisplayLink instead of NSTimer would probably be better for a game loop.
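For reference, a minimal CADisplayLink loop might look like this sketch (startGameLoop, stepFrame:, nextPositionWithTimeDelta: and gameObjectView are illustrative names, not from the question):

#import <QuartzCore/QuartzCore.h> // for CADisplayLink

- (void)startGameLoop {
    // CADisplayLink fires in sync with the display's refresh, unlike NSTimer,
    // which drifts relative to vsync and can be delayed by other events.
    CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self
                                                      selector:@selector(stepFrame:)];
    [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)stepFrame:(CADisplayLink *)link {
    // Run the movement math with the real frame interval so motion stays
    // smooth even if a frame is dropped, then translate the view; setting
    // .center is enough, no Quartz redraw is needed for a pure translation.
    CGPoint next = [self nextPositionWithTimeDelta:link.duration];
    self.gameObjectView.center = next;
}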

The problem with the iPhone simulator is that it has access to the same resources as your Mac: your Mac's RAM, video card, and so on. What I would suggest is opening Instruments (which comes with the iPhone SDK) and using the Core Animation template to have a look at how your resources are being managed. You could also look at Allocations to see if something is hogging RAM. The CPU Sampler could help as well.
tl;dr: the iPhone simulator uses your Mac's RAM and graphics card. Look at the sequence in Instruments to see where the lag is.

Related

Game Timers Slowing Down Overall Performance

I'm working on an iPhone app where I rely heavily on timers and animations, but I've realized my game is really slowing down and lagging in certain parts. I'm not quite sure how to improve this without removing animations.
Essentially I'm using the accelerometer to update my character's position (left/right). I also use several timers to read from different URLs and update images; the one that lags the most is one that loops an image's movement from left to right.
Basically I'm using about 6 or 7 timers plus the accelerometer. Is there a way I can improve the performance of my game without having to remove any of my animations or change the interval of the timers?
Thanks in advance!
Try using the Allocations instrument:
Run → Run with Performance Tools → Allocations
to see which part of the application is overloaded.
Instead of using NSTimer to animate, you might want to look at using the animationImages property in UIImageView (which lets you show multiple frames of animation, possibly looping) or beginAnimations within UIView (which lets you move an object along a path, among other things).
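For instance, both suggestions in a sketch (image names, sizes and durations here are made up):

// Frame-based animation: UIImageView cycles through a set of images.
UIImageView *sprite = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 64, 64)];
sprite.animationImages = [NSArray arrayWithObjects:
                          [UIImage imageNamed:@"walk1.png"],
                          [UIImage imageNamed:@"walk2.png"], nil];
sprite.animationDuration = 0.5;  // seconds for one full cycle
sprite.animationRepeatCount = 0; // 0 = repeat indefinitely
[sprite startAnimating];

// Movement: let Core Animation interpolate the position change
// instead of driving it from a timer.
[UIView beginAnimations:@"slide" context:NULL];
[UIView setAnimationDuration:2.0];
sprite.center = CGPointMake(300.0f, sprite.center.y);
[UIView commitAnimations];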
A good place to start is this question on recommended reading for iPhone animation, although for reference the documentation in Xcode and Apple's programming guides are good sources by themselves.
What others have suggested - in terms of using a game loop - makes sense too, but depending on the complexity of your game, sometimes just letting the iOS SDK take care of your animations can be good enough.

iPhone demo help: anyone know of a faster screen capture alternative to UIGetScreenImage()?

I'm working on an iPhone app that I'm going to be demo'ing to a live audience soon.
I'd really like to demo the app live over VGA to a projector, rather than show screenshots.
I bought a VGA adapter for iPhone, and have adapted Rob Terrell's TVOutManager to suit my needs. Unfortunately, the frame rate after testing on my television at home just isn't that good - even on an iPhone 4 (perhaps 4-5 frames per second, it varies).
I believe the reason for this slowness is that the main routine I'm using to capture the device's screen (which is then being displayed on an external display) is UIGetScreenImage(). This routine, which is no longer allowed to be part of shipping apps, is actually quite slow. Here's the code I'm using to capture the screen (FYI mirrorView is a UIImageView):
CGImageRef cgScreen = UIGetScreenImage(); // private API; returns a retained CGImage of the whole screen
self.mirrorView.image = [UIImage imageWithCGImage:cgScreen];
CGImageRelease(cgScreen); // we own the image UIGetScreenImage() returned
Is there a faster method I can use to capture the iPhone's screen and achieve a better frame rate (shooting for 20+ fps)? It doesn't need to pass Apple's app review - this demo code won't be in the shipping app. If anyone knows of any faster private APIs, I'd really appreciate the help!
Also, the above code is being executed by a repeating NSTimer which fires every 1.0/desiredFrameRate seconds (currently every 0.1 seconds). I'm wondering whether wrapping those calls in a block and using GCD or an NSOperationQueue might be more efficient than having the NSTimer invoke my updateTVOut Objective-C method that currently contains those calls. I'd appreciate some input on that too - some searching seems to indicate that Objective-C message sending is somewhat slow compared to other operations.
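(For reference, a GCD timer source of the kind being considered would look something like the following sketch, reusing the updateTVOut method named above - whether it actually beats NSTimer here would need measuring:)

// Sketch: a GCD timer firing every 0.1 s on the main queue (iOS 4+ only).
dispatch_source_t timer = dispatch_source_create(DISPATCH_SOURCE_TYPE_TIMER,
                                                 0, 0, dispatch_get_main_queue());
dispatch_source_set_timer(timer,
                          DISPATCH_TIME_NOW,
                          NSEC_PER_SEC / 10,   // interval: 0.1 s
                          NSEC_PER_SEC / 100); // leeway: 10 ms
dispatch_source_set_event_handler(timer, ^{
    [self updateTVOut]; // the same method the NSTimer currently invokes
});
dispatch_resume(timer);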
Finally, as you can see above, the CGImageRef that UIGetScreenImage() returns is being turned into a UIImage, and then that UIImage is being passed to a UIImageView, which is probably resizing the image on the fly. I'm wondering if the resizing might be slowing things down even more. Any ideas on how to do this faster?
Have you looked at Apple's recommended alternatives to UIGetScreenImage? From the "Notice regarding UIGetScreenImage()" thread:
Applications using UIGetScreenImage() to capture images from the camera should instead use AVCaptureSession and related classes in the AV Foundation Framework. For an example, see Technical Q&A QA1702, "How to capture video frames from the camera as images using AV Foundation". Note that use of AVCaptureSession is supported in iOS4 and above only.
Applications using UIGetScreenImage() to capture the contents of interface views and layers should instead use the -renderInContext: method of CALayer in the QuartzCore framework. For an example, see Technical Q&A QA1703, "Screen capture in UIKit applications".
Applications using UIGetScreenImage() to capture the contents of OpenGL ES based views and layers should instead use the glReadPixels() function to obtain pixel data. For an example, see Technical Q&A QA1704, "OpenGL ES View Snapshot".
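For the screen-mirroring case here (UIKit content), the QA1703 approach boils down to something like this sketch (mirrorView is the UIImageView from the question; note that -renderInContext: is CPU-bound as well, so it's worth profiling against UIGetScreenImage() before committing to it):

- (void)updateTVOut {
    // Render the key window's layer tree into a bitmap context
    // at the device's scale (the 0 scale argument).
    UIGraphicsBeginImageContextWithOptions([UIScreen mainScreen].bounds.size, YES, 0);
    [[[UIApplication sharedApplication] keyWindow].layer
        renderInContext:UIGraphicsGetCurrentContext()];
    self.mirrorView.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}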
New solution: get an iPad 2 and mirror the output! :)
I don't know how fast this is, but it's worth a try ;)
CGImageRef screenshot = [[UIApplication sharedApplication] _createDefaultImageSnapshot];
[myVGAView.layer setContents:(id)screenshot];
where _createDefaultImageSnapshot is a private API. (Since it's for a demo... it's OK, I suppose.)
and myVGAView is a normal UIView.
If you get CGImageRefs, then just pass them to the contents of a layer; it's lighter and should be a little bit faster (but just a little bit ;) ).
I haven't got the solution you want (simulating video mirroring), but you can move your views to the external display. This is what I do, and there is no appreciable impact on the frame rate. However, since the view is no longer on the device's screen, you obviously can no longer directly interact with it or see it. If you have something like a game controlled with the accelerometer this shouldn't be a problem; something touch-based will require some work. What I do is have an alternative view on the device when the primary view is external. For me this is a 2D control view to "command" the normal 3D view. If you have a game, you could perhaps create an alternative input view to control the game (virtual buttons/joystick, etc.). How best to work around it really depends on what you have.
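(For reference, attaching a window to the external screen looks roughly like this - a sketch using the UIScreen API available since iOS 3.2; gameView is an illustrative name for whatever you want projected:)

if ([[UIScreen screens] count] > 1) {
    UIScreen *external = [[UIScreen screens] objectAtIndex:1];
    UIWindow *externalWindow = [[UIWindow alloc] initWithFrame:external.bounds];
    externalWindow.screen = external;      // attach the window to the VGA screen
    [externalWindow addSubview:gameView];  // the content to project
    externalWindow.hidden = NO;            // keep a reference so it isn't deallocated
}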
Not having jailbroken myself I can't say for sure, but I am under the impression that a jailbroken device can essentially enable video mirroring (like they use at the Apple demos...). If true, that is likely your easiest route if all you want is a demo.

mixing OpenGL and Interface Builder/ UI Controls - bad idea? Why? (iPhone)

I've heard that OpenGL ES and standard iPhone UI controls don't play well together, but I'm wondering if anyone knows why, and what the effects are? I'm writing an OpenGL-based game, and the view is loaded from a nib file with UI controls, and it seems to work OK, but the game is really simple at this point... does using UI controls cause some kind of performance hit?
UI events momentarily pause timers - for example, while scrolling a table view. You can get around this by scheduling the timer in the common run loop modes when you create it (see the sketch below). Having a lot of layers may also slow down your rendering, because they all need to be redrawn every time you refresh. So if your game runs at 60fps, it will also redraw everything sitting on top of the GLView - UIImageViews, buttons, etc. - 60 times a second, which is a huge waste. It might not make a big impact on your frame rate, but it may make the device run hotter and drain the battery faster. It's best to draw your HUD using OpenGL, but it depends on the situation. For something that will be displayed only for a short time, like a menu, I think you can get away with it.
There's nothing wrong with it; it's just wasteful.
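The common-modes trick in a sketch (tick: is an illustrative selector name):

NSTimer *timer = [NSTimer timerWithTimeInterval:1.0 / 60.0
                                         target:self
                                       selector:@selector(tick:)
                                       userInfo:nil
                                        repeats:YES];
// NSRunLoopCommonModes includes the tracking mode used while scrolling,
// so UI events no longer pause the timer.
[[NSRunLoop mainRunLoop] addTimer:timer forMode:NSRunLoopCommonModes];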

iPhone. I need to touch every fullscreen pixel at 30fps. Doable?

I am interested in doing some image-hacking apps. To get a better sense of expected performance, can someone give me some idea of the overhead of touching each pixel at fullscreen resolution?
Typical use case: the user pulls a photo out of the Photo Album, selects a visual effect and - unlike a Photoshop filter - drives the effect in realtime through gestural manipulation of the device.
I'm just looking for ballpark performance numbers here. Obviously, the more compute-intensive my effect, the more lag I can expect.
Cheers,
Doug
You will need to know OpenGL well to do this. The iPhone's OpenGL ES hardware has a distinct advantage over many desktop systems in that there is only one pool of memory - so textures don't really need to be 'uploaded to the card'. There are ways to access a texture's memory pretty much directly.
The 3GS has a much faster OpenGL stack than the 3G; you will need to try it on a 3GS or the equivalent iPod touch.
Also compile and run the GLImageProcessing example code.
One thing that will make a big difference is if you're going to do this at device resolution or at the resolution of the photo itself. Typically, photos transferred from iTunes are scaled to 640x480 (4 times the number of pixels as the screen). Pictures from the camera roll will be larger than that - up to 3Mpix for 3GS photos.
I've only played around with this a little bit, but doing it the obvious way - i.e. a CGImage backed by an array in your code - you could expect something in the range of 5-10 FPS. If you want something more responsive than that, you'll have to come up with a more creative solution. Maybe map the image as textures on a grid of points and render with OpenGL?
Look up FaceGoo in the App Store. That's an example of an app that uses a straightforward OpenGL rendering loop to do something similar to what you're talking about.
Not doable, not with the current APIs and a generic image filter. Currently you can only access the screen through OpenGL or higher abstractions and OpenGL is not much suited to framebuffer operations. (Certainly not the OpenGL ES implementation on iPhone.) If you change the image every frame you have to upload new textures, which is too expensive. In my opinion the only solution is to do the effects on the GPU, using OpenGL operations on the texture.
My answer is: just wait a little until they get rid of the OpenGL ES 1.x devices and finally bring Core Image over to the iPhone SDK.
With fragment shaders this is very doable on the newer devices.
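As a sketch of what that looks like, here is a trivial ES 2.0 fragment shader embedded as a C string (the usual way to ship shaders in an iOS app; all names here are illustrative). It scales the photo's brightness by a uniform the app would update each frame from gesture input:

static const char *kEffectFragmentShader =
    "precision mediump float;                                 \n"
    "uniform sampler2D u_photo;   // full-screen photo texture\n"
    "uniform float     u_gain;    // driven by gestures       \n"
    "varying vec2      v_texCoord;                            \n"
    "void main() {                                            \n"
    "    vec4 c = texture2D(u_photo, v_texCoord);             \n"
    "    gl_FragColor = vec4(c.rgb * u_gain, c.a);            \n"
    "}                                                        \n";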
I'm beginning to think the only way to pull this off is to write a suite of vertex/fragment shaders and do it all in OpenGL ES 2.0. I'd prefer not to incur the restriction of limiting the app to the iPhone 3GS and later, but I think that's the only viable way to go here.
I was really hoping there was some CoreGraphics approach that would work but that does not appear to be the case.
Thanks,
Doug

How can I programmatically determine CPU usage rate or how busy / occupied the system is in iPhone OS?

My app does some pretty but heavyweight Core Animation effects during scrolling. Sometimes it crashes due to bad performance. So I need some way to find out whether there is enough headroom to run the animations, and if not, just leave them out. The best way would be if I could ask the system how busy it is.
UPDATE: I mean especially Core Animation.
By animation, do you mean frames that play after one another (like an animated GIF) or some CoreAnimation (OpenGL) effect that is moving polygons with mapped textures around?
If it's the former, I'd really consider some way of optimizing the animation or eliminating it in all cases.
If it's the latter, I'd do some deeper digging into the source of the problem. Core Animation under normal circumstances will drop frames in order to keep from getting into situations like this in the first place.
In either case you might consider loading the texture assets a little earlier. I have had some trouble in my apps with animation methods that take a UIImage parameter when I created the UIImage in the function call itself. Preloading the asset a little earlier in my code took care of the problem nicely.
As an example:
BAD
[[UIImage imageNamed:@"checkmark.png"] drawAtPoint:p];
BETTER
//declared at top of class
static UIImage *checkmark = nil;
in init:
checkmark = [UIImage imageNamed:@"checkmark.png"];
in drawRect:
[checkmark drawAtPoint:p];
You would need to adapt this technique to your particular situation. In my case, checkmark is used often and is quite small, so I don't mind dedicating the memory to it permanently.
I wonder if your crashes could be fixed by making sure the assets were ready to be used by the application.
I wouldn't do that. If your app crashes, it's too heavy. You could run your app under Instruments to see where your bottlenecks are.
So, without trying to sound too harsh, your best bet is to rewrite some parts so that your app runs on an iPhone at all times.