Getting Framerate Performance on iPhone

I've just come off the PSP, where performance testing was easy: you just turned off vsync and printed out the framerate, then changed something and saw whether the frame rate went up or down...
Is there any way to do the same thing on the iPhone? How do you turn vsync off? The Instruments tool is next to useless; its chief problem is that running it adversely affects the performance of the app. Also, the frame rate it reports is extremely sporadic.
I don't want any fancy tool that reports call trees and time spent in each function. I just want an unrestricted frame rate and some way to see what it is. Is there a high-precision counter you can use on the iPhone? Something like QueryPerformanceCounter in Windows?
Also, is there any way to somehow KILL background processes so you know they can't affect the performance, perhaps solving the sporadic frame rate problem?

Profile your app with Instruments and use the Core Animation instrument. It gives a frame rate.
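If you also want a number you can log from inside the app, the closest iOS analogues to QueryPerformanceCounter are mach_absolute_time() and the simpler CACurrentMediaTime(). Here is a minimal sketch of a frame counter built on the latter; drawFrame is just a placeholder for whatever per-frame method you already have:
#import <QuartzCore/QuartzCore.h>   // for CACurrentMediaTime()

// Placeholder per-frame hook; put this logic wherever your rendering happens.
- (void)drawFrame
{
    static CFTimeInterval lastTime = 0;
    static NSUInteger frames = 0;
    static CFTimeInterval accumulated = 0;

    CFTimeInterval now = CACurrentMediaTime();   // high-resolution, mach-based clock
    if (lastTime > 0) {
        accumulated += now - lastTime;
        frames++;
        if (accumulated >= 1.0) {                // report roughly once per second
            NSLog(@"%.1f fps", frames / accumulated);
            frames = 0;
            accumulated = 0;
        }
    }
    lastTime = now;

    // ... existing rendering code ...
}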

You're taking the try-something-and-measure approach, which is very indirect. It's easier to find out exactly what is taking the time; that doesn't depend on what else is going on and doesn't require learning a new tool. All you need is a debugger you can interrupt.

You can't kill background processes on the iPhone. Allowing that would make it possible for a buggy or malicious app to interfere with the phone function, and the needs of everything else on the iPhone are subordinate to the phone.

Try QuartzDebug or OpenGL Profiler.

Use Instruments to get the frame rate.
To do this, profile your app (click and hold the Run button in Xcode and choose Profile). Make sure you are running the app on a device. Choose the OpenGL ES Analysis template and look at the data display under Core Animation Frames Per Second.
You want to aim for 60 fps.

Related

Possibilities to reduce power consumption with cocos2d apps

I made a board game that includes just a few small animations. I reduced the fps from 60 to 30 to reduce the processor load, but the device still gets very warm.
Another application made without cocos2d does not heat it up so much.
Are there any methods to calm the iPhone down?
The device state is as follows:
Wi-Fi is always enabled
The app uses Game Center
GPS is inactive
fps is locked at 30
I use cocos2d-iphone as the engine
It might be worth experimenting with different director types, e.g. kCCDirectorTypeNSTimer, and seeing if that helps at all. Those will have the biggest effect on the main loop of the game.
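In cocos2d-iphone 1.x the director type is chosen in the app delegate before the shared director is first used; later versions dropped this API, so treat the following as a sketch for that version only:
// In your UIApplicationDelegate, before the first call to [CCDirector sharedDirector].
// To experiment with the NSTimer-driven loop, swap in kCCDirectorTypeNSTimer here.
if( ! [CCDirector setDirectorType:kCCDirectorTypeDisplayLink] )
    [CCDirector setDirectorType:kCCDirectorTypeDefault];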
You should also spend some time with Instruments if you've not already, as that will show you where the CPU is spending its time and give you some hints on where you could ease things up.
I've noticed that a sequence of short animations in cocos2d takes a lot of processor time. I tried making hint markers in the level that pulse in size: 0.1 seconds scaling up, 0.15 scaling down, and 0.2 holding, all wrapped in a repeat-forever sequence. Everything was terribly slow. Then I just ran the animation manually, the device calmed down, and the fps went back up to 60.
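For reference, the action-based pulse being described would look roughly like this (tipSprite and the exact scale values are illustrative); replacing it with a hand-scheduled update was what brought the frame rate back:
// One pulsing hint marker: scale up for 0.1 s, back down for 0.15 s, hold for 0.2 s,
// repeated forever. Running many of these at once was what hurt performance.
id up    = [CCScaleTo actionWithDuration:0.10f scale:1.2f];
id down  = [CCScaleTo actionWithDuration:0.15f scale:1.0f];
id hold  = [CCDelayTime actionWithDuration:0.20f];
id pulse = [CCSequence actions:up, down, hold, nil];
[tipSprite runAction:[CCRepeatForever actionWithAction:pulse]];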
When showing menus or dialogs that do not require animation, you can actually lower your framerate even further.
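One way to do that with cocos2d-iphone's CCDirector, assuming your normal rate is the 30 fps mentioned above:
// Drop to roughly 15 fps while a static menu or dialog is on screen...
[[CCDirector sharedDirector] setAnimationInterval:1.0/15];
// ...and restore the normal rate when gameplay resumes.
[[CCDirector sharedDirector] setAnimationInterval:1.0/30];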

cocos2d code slows down after some time

Hi all,
I am making a game in cocos2d, and I am moving an object from one place to another through ccTouchBegan, ccTouchMoved and ccTouchEnded (using the ccp function), and after that I run an action on it.
Any thoughts on why this code runs slowly on the device but fast in the iPhone simulator?
Show us the code and then we can say something more specific.
But I think you just forgot to stop the action: [object stopAllActions];
Or you can use [self removeChild:(CCSprite *)sender cleanup:YES]; it will also clean up any running actions, depending on the cleanup parameter.
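For context, a rough sketch of that idea in a targeted touch handler; movingSprite and the 0.1-second duration are placeholders for your own names and values:
- (void)ccTouchMoved:(UITouch *)touch withEvent:(UIEvent *)event
{
    CGPoint p = [[CCDirector sharedDirector] convertToGL:[touch locationInView:touch.view]];
    // Stop whatever the sprite is already doing so actions don't pile up
    // as touch events keep arriving.
    [movingSprite stopAllActions];
    [movingSprite runAction:[CCMoveTo actionWithDuration:0.1f position:p]];
}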
Code often runs slower on the device than it does in the simulator. The simulator is not accurate with respect to performance. In order to gauge how fast something executes, you have to try it on a device.
Check your memory allocations.
I was also having the same problem; it was because of memory management, and I have now solved it.
Check for memory leaks using the performance tools in Xcode.

Why is my touch event frequency slower on iOS 4.0?

I'm writing a finger-sketching app and had it working great on iOS 3.0. However, running the app under iOS 4.0 or greater is causing problems. Specifically, I am receiving touch events approximately 5x slower on the new OS than on the old. This obviously causes my app to draw poorly, because I'm capturing 5x fewer data points to draw between.
Any ideas on how to speed up touch event frequency on iOS4?
Certain graphics operations take a lot longer in iOS 4.x than in 3.x (somewhere between 2x and 10x slower). The longer graphics execution times could be blocking the main UI thread and not leaving enough time for the main thread's run loop to handle user (touch) events.
Profile your drawRect: code and see if it's now taking longer than one refresh interval (about 16 ms at 60 Hz). If so, try speeding up or breaking up your rendering, or try a lower frame rate, and see if any of that helps.
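A quick way to check that without extra tooling is to time drawRect: yourself; this sketch just assumes QuartzCore is linked for CACurrentMediaTime():
- (void)drawRect:(CGRect)rect
{
    CFTimeInterval start = CACurrentMediaTime();

    // ... existing drawing code ...

    CFTimeInterval elapsed = CACurrentMediaTime() - start;
    // At 60 Hz, anything much over ~16 ms will starve the run loop and delay touches.
    if (elapsed > 1.0 / 60.0)
        NSLog(@"drawRect: took %.1f ms", elapsed * 1000.0);
}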
You should profile your code with Shark to see where the bottleneck is. What is probably happening is that some of your code that executes on the main thread (most likely something that runs in response to a touch) is taking longer than expected, which is preventing your app from receiving touch events.
Block the main thread less.
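One common way to do that (a sketch, assuming iOS 4's Grand Central Dispatch; renderStrokesOffscreen and imageView are hypothetical names) is to push the expensive work onto a background queue and only touch UIKit from the main queue:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    UIImage *rendered = [self renderStrokesOffscreen];   // hypothetical heavy rendering
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = rendered;                 // UI updates stay on the main thread
    });
});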

Can the exposure time be manually adjusted for an iOS camera?

I want to adjust the exposure of the iPhone/iPod touch camera in fine detail. I would prefer to take a series of photos with decreasing exposure times to obtain a sequence of images (for HDR reconstruction). Is this possible?
If not, what's the next best thing? It seems you can set a point of interest in the image for the autoexposure. Perhaps I could search for a dark/light region of the image and then use this exposurePointOfInterest to adjust the exposure, but this seems like a very indirect solution that is also error-prone. If anybody has tried an alternative, such an answer is also desirable.
iOS gives you control over frame durations via videoMinFrameDuration and videoMaxFrameDuration, and exposure times vary with frame rate and frame duration. By setting the minimum and maximum frame duration to a particular value, you lock the frame rate, which in turn affects your exposure times. This is also a very indirect way of controlling exposure, but it may help in your case. An example would look like this:
// conn is the AVCaptureConnection from your capture output, and
// CAPTURE_FRAMES_PER_SECOND is the frame rate you want to lock to.
if (conn.isVideoMinFrameDurationSupported)
    conn.videoMinFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
if (conn.isVideoMaxFrameDurationSupported)
    conn.videoMaxFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
Since you would have to decrease the shutter speed of the camera, this unfortunately does not appear to be possible, and more importantly, against the HIG:
Changing the behavior of iPhone external hardware is a violation of
the iPhone Developer Program License Agreement. Applications must
adhere to the iPhone Human Interface Guidelines as outlined in the
iPhone Developer Program License Agreement section 3.3.7
Related article Apple Removes Camera+ iPhone App From The App Store After Developer Reveals Hack To Enable Hidden Feature.
If it can be done programmatically, instead of with the hardware, you might have a chance, but then it's just an effect applied to an image, not a true long-exposure picture.
There are some simulated slow shutter apps that do get approved like Slow Shutter or Magic Shutter.
Related article: New iPhone Camera App “Magic Shutter” Hits The App Store.
This is supported since iOS 8:
http://developer.xamarin.com/guides/ios/platform_features/intro_to_manual_camera_controls/
Have a look at AVCaptureExposureModeCustom and CaptureDevice.LockExposure...
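In native AVFoundation terms (the link above shows the Xamarin bindings), the corresponding call is AVCaptureDevice's setExposureModeCustomWithDuration:ISO:completionHandler:. A minimal sketch, assuming iOS 8+ and the default video device:
NSError *error = nil;
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([device lockForConfiguration:&error]) {
    // Pick a duration between device.activeFormat.minExposureDuration and
    // .maxExposureDuration; 1/15 s is just an illustrative value.
    CMTime duration = CMTimeMake(1, 15);
    [device setExposureModeCustomWithDuration:duration
                                          ISO:AVCaptureISOCurrent
                            completionHandler:nil];
    [device unlockForConfiguration];
}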
I tried to do this for my motion-activated camera app (Pocket Sentry) and I found that it is not possible to do this AND get approved in the App Store.
I have been trying to do this myself. I think it's possible only by using the exposure point of interest property. I am detecting the dark and bright spots and then adjusting the point accordingly.
Please refer to: Detecting bright/dark points on iPhone screen
Does anyone know a better way to do this?
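For what it's worth, a sketch of that point-of-interest approach with AVCaptureDevice; the CGPoint (normalized 0..1 coordinates) is illustrative and would come from your bright/dark spot detection:
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([device isExposurePointOfInterestSupported] &&
    [device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        device.exposurePointOfInterest = CGPointMake(0.8, 0.2);   // illustrative point
        device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
        [device unlockForConfiguration];
    }
}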
I am not too sure, but you could try using the AVFoundation classes to build the camera app, following Apple's sample code:
AVCam Sample Code
And then try to leverage the exposureMode property of AVCaptureDevice:
exposureMode Class Reference

My app closes without any warning or error message

I'm programming a puzzle game for iPhone using OpenGL.
There is one very weird "bug" (I'm not sure what it is): whenever I touch the screen a great number of times in a short period, my app closes without giving a warning or error.
What could be the cause? I guess it has something to do with memory, but I would like to know.
Edit:
I also think this happens because I'm calling multiple functions every time the user touches the screen or moves their fingers.
Sounds like you're running out of memory.
A few quick tips that might help out:
Check your memory profile over time using Instruments. If you see a steady incline over time, it's likely to be a memory leak, or an inefficient algorithm that is allocating more memory than you need.
Use a static analyzer to help check for leaks, such as Clang.
Images and image-related files are particularly memory-hungry, so focus on efficiency for them. When you work with textures in OpenGL, use the PVRTC format, which offers awesome compression.
didReceiveMemoryWarning: is your friend, i.e. a good chance to throw out anything you don't absolutely need in memory (see the sketch after this list). Better to be memory-efficient the whole time, though.
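A minimal sketch of that last tip, assuming the game runs under a UIViewController subclass and keeps some kind of rebuildable cache (myTextureCache is a placeholder):
// Throw away anything you can rebuild later when the system warns about memory.
- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    [self.myTextureCache removeAllObjects];   // hypothetical NSCache of decoded images/textures
}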
Try setting up an NSUncaughtExceptionHandler. You may also want to set up a signal handler.
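A sketch of both, assuming that logging to the console before the process exits is enough for your purposes; install the handlers early, e.g. in application:didFinishLaunchingWithOptions::
#include <signal.h>

// Log uncaught Objective-C exceptions so a silent exit at least leaves a trace.
static void HandleUncaughtException(NSException *exception)
{
    NSLog(@"Uncaught exception: %@\n%@", exception, [exception callStackSymbols]);
}

// Log fatal signals (EXC_BAD_ACCESS and friends surface as SIGSEGV/SIGBUS).
static void HandleSignal(int sig)
{
    NSLog(@"Received signal %d", sig);
    exit(sig);
}

static void InstallCrashHandlers(void)
{
    NSSetUncaughtExceptionHandler(&HandleUncaughtException);
    signal(SIGSEGV, HandleSignal);
    signal(SIGBUS,  HandleSignal);
    signal(SIGABRT, HandleSignal);
}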