iPhone 4(S) vs iPad 2 computer vision performance problems - iphone

We have this project using OpenCV and at first we developed for iPad2.
Everything ran smoothly, and one computer vision object recognition iteration took a little under 1 second.
So far so good. Now we are testing the app on both the iPhone 4 and the iPhone 4S. Of course we did our research, and the published results stated that the iPhone 4S performs almost as fast as the iPad 2.
The results on the iPhone 4 are terrible: one iteration takes 15 seconds. On the iPhone 4S one iteration takes 8 seconds.
So with our algorithms:
iPhone 4 is 15x slower than an iPad 2
iPhone 4S is 7-8x slower than an iPad 2
Does anybody know if this is true? Is there something the iPhone does differently from the iPad 2? Isn't the processor the same type?
Anybody who can point us in the right direction?

Similar processors, different cameras. The iPad's camera has fewer megapixels than the iPhone's, so the iPhone hands you a much bigger frame. On the iPhone, make sure you downsize your image to an acceptable size before processing.
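A minimal sketch of that downscaling step with OpenCV (C++), assuming the camera frame is already in a cv::Mat; the 640-pixel target width is an arbitrary budget you would tune for your own algorithm:

#include <opencv2/imgproc/imgproc.hpp>

// Downscale a camera frame before running the recognition pipeline.
// maxWidth is an arbitrary budget, not a magic number.
cv::Mat downscaleForProcessing(const cv::Mat &frame, int maxWidth = 640)
{
    if (frame.cols <= maxWidth)
        return frame;  // already small enough

    double scale = static_cast<double>(maxWidth) / frame.cols;
    cv::Mat smaller;
    cv::resize(frame, smaller, cv::Size(), scale, scale, cv::INTER_AREA);
    return smaller;
}

Run the recognition on the returned cv::Mat; the quality loss is often negligible compared to the speedup on the slower phones.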

Related

cocos2d 2.x iphone 4 frame rate drop

I recently ported a project from Cocos2d 1.1 to 2.0. So far everything is working great, but I've noticed that the frame rate drops from 60 fps to around 40-50 fps on the iPhone 4 and the iPod Touch 4. Other devices I have tried (iPads 1 & 3, iPhone 4s) still run at 60.
At first I thought I had too many draw calls, but after reducing them from 54 to 17, I still had the same fps on both devices. I should note that I don't have this problem with the default Hello World template, nor do I have any OpenGL errors in the console. My memory footprint is around 50 MB, so I don't think that's the problem either. Any ideas?
Looks like you have a simple performance problem. The other devices running at 60 fps (at least iPad 3 and iPhone 4S) have more computing power and better graphics processors than the iPhone 4 and iPod Touch 4.
Since this happened after upgrading from cocos2d 1.1 to 2.0, there's the possibility that the 4th-generation iPhone devices don't run OpenGL ES 2.0 code (shaders) as fast as the other devices. It could also be an issue with cocos2d 2.0 itself. Unless you really need a feature of cocos2d 2.0 (i.e. shaders), the easiest approach would be to simply go back to cocos2d 1.1. If you aren't writing shaders, cocos2d v1.1 offers the same features as 2.0 (at this point in time).
Finally, test performance only with release builds. If you run debug builds, things like logging and assertions can totally skew any performance results.

Color-depth in iPhone simulator

I made a promotional video for my Exoplanet app and just noticed that the color-depth on the iPhone simulator is much lower than on the physical device. The visualizations look splendid on the actual phone but rather cheap on the simulator. I mainly notice this when using textures in OpenGL and CoreAnimation.
Have a look at this short video to see what I mean. (It has nothing to do with the recording.)
Am I doing something wrong? Can I increase the color-depth of the simulator?

Speeding up full screen image drawing on iPhone 3G

I am developing an app which plays interactive real-time streaming video. I use FFmpeg (don't worry, I'll be releasing my source code) to decode an MPEG-2/H.264 RTP stream. I simply cannot get the iPhone 3G to draw a screen full of pixels faster than 5 times per second.
I've tried an OpenGL texture, which was just as slow. I've also tried an array of 2D vertices covering the entire screen and using glDrawArrays, but that yielded 5 FPS as well. For now I've stuck to simply drawing a CGImage onto my view, which gives me about 7-8 FPS.
From what I've gathered, the private CoreSurface framework seems to be the only way. Does anyone have any tips or tricks to get at least 20-30 FPS? I'd hate to restrict my app to only the 3GS and iPod touches.
Thanks,
Andrew
If you want to play H.264 and MPEG2 video on the iPhone, why are you doing the decoding yourself? What do you need that MPMoviePlayer doesn't cover?
If you are decoding in software, you're never going to get a good frame rate. Even on faster hardware (iPhone 3GS and 2nd/3rd gen iPod touches), you're going to drain the battery very quickly.

Can the iPhone simulator handle PVR textures?

I have a really weird problem with PVR textures: the frame rate falls through the floor on the iPhone simulator, but on the iPhone itself it works just fine. Has anyone had a similar experience? I'm using SDK 3.1.2.
The iPhone simulator is known to be extremely slow for certain rendering scenarios.
One especially bad case we experienced: creating (glGenTextures + glTexImage2D) and later destroying (glDeleteTextures) a big texture (a title screen) would kill all performance until a hard simulator restart.
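To make that concrete, the sequence looked roughly like this (OpenGL ES 1.x calls; the 1024x1024 size and the pixels buffer are placeholders for your own data):

#include <OpenGLES/ES1/gl.h>

// Upload a large title-screen texture, draw it for a while, then delete it.
// 'pixels' stands in for your decoded RGBA image data.
static void uploadAndDropTitleTexture(const void *pixels)
{
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1024, 1024, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    // ... draw the title screen ...

    glDeleteTextures(1, &tex);  // on the simulator, performance stayed poor
                                // after this until a hard restart
}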
Testing your "logic" on the simulator and "assuming" it will be fast on the device has been working pretty good for us.
As always: Try avoiding state changes!
And yes: PVR textures are supported, but simulated in the shader.
Yes, the simulator supports PVRTC textures, but it probably decompresses them into an RGB format during upload or when a texture is used for the first time. In my game this causes a big slowdown until every PVR texture has been shown at least once. Of course, those slowdowns appear in the simulator only; on the actual device PVRTC textures are very, very fast.
The iPhone simulator is notorious for being only almost like the iPhone.
I cannot speak about PVR textures, as I am not sure. But from other things I have done (and from what I've read on the internet), most developers give up on the simulator rather quickly due to its minor differences from the real thing.
In the end if it works on the iPhone, then the simulator doesn't matter.

iPhone device vs. iPhone simulator

I have heard of apps not working properly on the simulator but working properly on the actual iPhone device. Has anyone experienced an app that runs perfectly in the simulator but not on the actual iPhone device?
Filenames are case-sensitive on the iPhone, but not in the simulator.
So, for example, if you try to load an image with UIImage *iconImage = [UIImage imageNamed:@"MyIcon.png"], but your resource is actually named "myicon.png", then it will work on the simulator but not on the device.
If your app is graphics-intensive, like say a game, the performance of the simulator DOES NOT resemble that of the hardware at all. Your application will probably be smooth and work great on the simulator, but on hardware it'll likely render at a crawl unless you know what you're doing. You can easily go from 60 fps on the simulator to 3 fps on the hardware.
The order in which function/constructor parameters are evaluated is different:
#include <stdio.h>

int i = 0;
int f(void) { return ++i; }

int a, b;
void test(int p1, int p2) {
    a = p1;
    b = p2;
}

int main(void) {
    test( f(), f() );
    printf("a = %d, b = %d\n", a, b);
    // simulator: a = 2, b = 1
    // device:    a = 1, b = 2
    return 0;
}
Trigonometry functions may return different results:
float a = cosf( 0.108271248639 );
printf("%.12f", a);
//simulator: 0.994144439697
//device: 0.994144380093
I know there are some differences in the OpenGL ES implementation between the device and the simulator. From what I understand, this is mainly because the graphics chip in the iPhone (PowerVR MBX) has vastly different capabilities from the graphics hardware in Macs. Many of the hardware limits are not enforced in the simulator, so it is entirely possible to get something running in the simulator that will totally crash on the device.
There are also some OpenGL ES extensions that are supported by the iPhone hardware that are not supported in the simulator. I believe the major one is PowerVR texture compression (PVRTC).
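If you want to see that difference at runtime, you can query the extension string in both environments; a rough sketch (assuming a current OpenGL ES context), not an official capability API:

#include <OpenGLES/ES1/gl.h>
#include <string.h>

// Returns non-zero if the current GL ES context advertises PVRTC support.
// Typically present on the device; may be missing or behave differently
// in the simulator.
static int supportsPVRTC(void)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext != NULL &&
           strstr(ext, "GL_IMG_texture_compression_pvrtc") != NULL;
}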
Another problem area can be memory footprint. Anecdotally, I have not seen the simulator automatically enforce the memory limitations of the device. Therefore, it is possible to have something that runs in the simulator fine, happily consuming copious amounts of RAM and never bothering to free any of it only to be swiftly terminated after a short continuance of such behaviour when running on a device.
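Since the simulator won't complain, it can help to log your real resident memory while running on the device; a small sketch using Mach's task_info (the logging and any thresholds are up to you):

#include <mach/mach.h>
#include <stdio.h>

// Print the app's current resident memory, in megabytes.
// Mainly useful on the device, where the limits the simulator ignores apply.
static void logResidentMemory(void)
{
    struct task_basic_info info;
    mach_msg_type_number_t count = TASK_BASIC_INFO_COUNT;

    if (task_info(mach_task_self(), TASK_BASIC_INFO,
                  (task_info_t)&info, &count) == KERN_SUCCESS) {
        printf("resident size: %.1f MB\n",
               info.resident_size / (1024.0 * 1024.0));
    }
}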
There are certain bits of code that won't work on the simulator (using the iPhone Keychain, for example), but for almost all applications, the simulator will work exactly like the iPhone.
That said, there's absolutely no replacement for testing on a real device.
I had a problem running a relatively simple 1/30 sec timer to do updates for a game. It runs fine in the simulator, and freezes out input on the device.
Also note that you will be compiling against the OS X frameworks (where applicable) when building for the simulator so you could be using methods and classes that aren't available on the iPhone versions of the frameworks.
One example I can think of off the top of my head is NSPredicate. I was able to compile and run an app using NSPredicate in the simulator, but it wouldn't compile for the device since that class isn't available.
Fingertips are larger than the 1 pixel endpoint of a mouse cursor. To do proper, even minimal, usability testing you should deploy your App to a device.
If you enable GCC_ENABLE_FLOATING_POINT_LIBRARY_CALLS, your app will crash all over the place in the simulator but work on the iPhone.
Quartz graphics calls in the iPhone simulator are faster than Java2D calls on the same computer--wicked fast.
I've had issues in memory-hungry applications where the Simulator would work just fine (because it would assume the iPhone/iPod Touch's memory was all yours to play with), while the device would crash (because other apps had leaked and Apple background services had eaten up some memory) and I hadn't implemented proper memory management or a response to the didReceiveMemoryWarning selector.
One big thing that took me a while to spot was that the simulator does not support device tokens, so any code that involves those will not run on the simulator.
I had a bug where the app would work fine on the simulator, but would crash when I ran it on a device because there was a bug in the device token code. I couldn't figure it out for the longest time!
There are many trivial examples. For instance, you can allocate far more memory on the simulator than on a real phone, and you cannot receive push notifications on the simulator. If you don't have a Retina Mac, the display dot pitch is different as well.
At a more fundamental level, the simulator is just that: it simulates iOS on top of Mac OS X. This is evidenced by the fact that the filesystem on the simulator may not be case-sensitive, but on the phone it will be. More subtly, it does not emulate the hardware, so things like location will not work the same, and 3D rendering is going to be very different, especially if you are using Metal.
You should always test on real hardware.
Without considering the performance differences between the two, there used to be some things that the simulator didn't do correctly, e.g. it would mess up audio in some cases (see this question). However, since the 2.2 SDK this issue has been resolved and the sound seems to be fine in the simulator. That's not to say there aren't other incompatibilities lurking down there! (Just none I've run into.)
Regarding sounds, I was having the same problem. The issue was that the Simulator supports a different set of sound encodings than the device does. I hope that helps.
I had many problems with libraries and frameworks when moving from the simulator to the device. Not least that they seem to have different architectures!
I have seen the positioning of objects, like toolbars, be different on the simulator than on the phone. Very annoying.
Yeah...
Apps compiled for 2.x will work fine on a 3.0 device, but they can crash on the 3.0 simulator.
Note:
1. If you compile for 3.0, the app will also work fine on the 3.0 simulator.
2. To reproduce the crash:
a) Compile for 2.x and launch the app in the simulator.
b) Now change the iPhone Simulator hardware version to "3.0".
c) Then launch the app installed in step a). CRASH!
In my experience a movie file (m4v type) plays properly the first time, but the second time it makes the simulator's screen flicker, whereas on an iPhone device it works fine.
I had some sound effects that played fine in the simulator, but not on the device. I had to change the format to something that the device would handle.
If the application's status bar is hidden, in the simulator it still consumes touch events, but on the device it behaves correctly.
Yes - it happened to me the other day. I'm new to the iPhone, and so I had deleted MainWindow.xib thinking it was unused. The app worked perfectly on the simulator, but crashed when installed on the phone.
Another issue we ran into was our three20 dependencies, which were set to iOS 3.2 instead of 4.1. It worked perfectly in the simulator, but bombed on the device (since the files were compiled for the wrong architecture).
The iPhone video library is not accessible in the simulator, but the same code works fine on an actual device.
Resource loading in the simulator is MUCH faster than in the device. For example, loading and displaying a sequence of full-screen UIImages (like a rudimentary video) can look very smooth in the simulator, and choppy on a device.
In fact, remember that there is a huge speed difference between different devices. The original iPhone and the iPhone 3G are slower than the iPod touch 2nd Gen, which is also much slower than the iPhone 3GS, and so on.
When you access UIDevice.currentDevice(), it reports the iOS Simulator instead of the actual device you're testing on. This sucks, since you can't do certain things on the simulator.
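If you need to branch on this, one option is the compile-time check from TargetConditionals.h rather than parsing the device name; a minimal sketch (the messages are just placeholders):

#include <TargetConditionals.h>
#include <stdio.h>

// TARGET_IPHONE_SIMULATOR comes from TargetConditionals.h: it is 1 when
// building for the simulator, so device-only code paths can be excluded.
static void reportTarget(void)
{
#if TARGET_IPHONE_SIMULATOR
    printf("Running in the iOS Simulator; some features are unavailable.\n");
#else
    printf("Running on a real device.\n");
#endif
}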
