We have this project using OpenCV and at first we developed for iPad2.
Everything ran smoothly, and a computer vision object recognition iteration took a little under 1 second.
So far so good. Now we are testing the app on both the iPhone 4 and the 4S. Of course we did our research, and the results we found stated that iPhone 4S performance was almost as fast as the iPad 2's.
The results on the iPhone 4 are terrible: one iteration takes 15 seconds. On the iPhone 4S, one iteration takes 8 seconds.
So with our algorithms:
iPhone4 is 15x slower than an iPad2
iPhone4S is 7-8x slower than an iPad2
Does anybody know if this is true? Is there something the iPhone does differently from the iPad 2? Isn't the processor the same type?
Anybody who can point us in the right direction?
Similar processors, different cameras. The iPad's camera has fewer megapixels than the iPhone's. On the iPhone, make sure you downsize your image to an acceptable size before processing.
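As a rough sketch (the target size is just an illustrative value, not a recommendation from this answer), you could scale the UIImage down with UIKit before converting it for OpenCV:
#import <UIKit/UIKit.h>

// Sketch: scale a UIImage down before handing it to the vision pipeline.
// The target size is an arbitrary example, not a tuned value.
static UIImage *DownscaledImage(UIImage *image, CGSize targetSize) {
    UIGraphicsBeginImageContextWithOptions(targetSize, YES, 1.0);
    [image drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
    UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaled;
}

// Example usage: UIImage *small = DownscaledImage(cameraImage, CGSizeMake(480, 360));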
My question is: with the simulator, is it possible to simulate the other types of iPhone (3G, 3GS), or only the iPhone 4?
Thanks
iPhone Simulator only simulates a small portion of the software. It does not emulate hardware at all (otherwise it would have been called iPhone Emulator).
The only way to test your apps on real-world hardware constraints is to get a paid program account and install your apps on real devices.
In any case, the hardware differences between the iPhone 3G and the 3GS don't come into play when testing in the simulator. The simulator doesn't limit memory usage or CPU speed based on which hardware you've selected, and the accelerometer cannot be simulated, so the only difference worth considering is the screen size, which changes between the iPhone 4 and the 3G/3GS, but not between the 3GS and the 3G.
Good luck!
No, it's not possible. If you try to detect the device type on the simulator with the following code, it will never return anything except 'iPhone Simulator'.
#import <sys/utsname.h>
// Returns the raw machine identifier reported by uname()
// (on a device this is a hardware model string such as "iPhone3,1").
+ (NSString *)getDeviceType {
    struct utsname systemInfo;
    uname(&systemInfo);
    return [NSString stringWithCString:systemInfo.machine
                              encoding:NSUTF8StringEncoding];
}
I am developing an augmented reality iOS app (iPhone/iPad/iPod) and I would like to have a list of devices where this feature is supported.
I mean the compass and the geomagnetic field so that I can get the orientation of the device in all degrees of freedom.
Thanks!
The compass is available on all iPads and on the iPhone 3GS and 4.
I was also wondering whether there are any devices left capable of running iOS7 or iOS8 that lack a compass, and some quick research turned up the following:
Compass available:
on any iPhone 3GS or later. iOS7 support from iPhone 4 or later, iOS8 support from 4S or later.
on any iPad. iOS7 and iOS8 support starts with iPad 2 or later.
Compass not available:
on any iPod touch. The 5th generation (the current model as of 2015) does support iOS7 and iOS8.
So if you design an application for iOS7 or iOS8 and only for the iPhone/iPad, heading will be available; on the iPod touch it will not.
Compass does not provide orientation in all degrees of freedom, although you can get that data from the orientation sensors.
You should also look at the gyroscope (iPhone 4 and iPod touch 4th generation only so far), which provides much more accurate and faster orientation sensing in all degrees of freedom; see UFO on Tape for an excellent example.
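As a minimal sketch (assuming a device with a gyroscope and iOS 4 or later; the update rate is just an example), Core Motion's CMMotionManager exposes that orientation data:
#import <CoreMotion/CoreMotion.h>

// Sketch: read roll/pitch/yaw from Core Motion's sensor-fused device motion.
CMMotionManager *motionManager = [[CMMotionManager alloc] init];
if ([motionManager isDeviceMotionAvailable]) {
    motionManager.deviceMotionUpdateInterval = 1.0 / 30.0; // example rate
    [motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                       withHandler:^(CMDeviceMotion *motion, NSError *error) {
        CMAttitude *attitude = motion.attitude; // roll, pitch, yaw in radians
        NSLog(@"roll %f pitch %f yaw %f", attitude.roll, attitude.pitch, attitude.yaw);
    }];
}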
I have heard of apps not working properly on the simulator but working properly on the actual iPhone device. Has anyone experienced an app that runs perfectly in the simulator but not on the actual iPhone device?
Filenames are case-sensitive on the iPhone, but not in the simulator.
So, for example, if you try to load an image with UIImage *iconImage = [UIImage imageNamed:@"MyIcon.png"], but your resource is actually named "myicon.png", then it will work on the simulator but not on the device.
If your app is graphics-intensive, say a game, the performance of the simulator DOES NOT resemble that of the hardware at all. Your application will probably be smooth and work great on the simulator, but on hardware it'll likely render at a crawl unless you know what you're doing. You can easily go from 60fps on the Simulator to 3fps on hardware.
The order in which function/constructor parameters are evaluated is different:
int i = 0;
int f(void) { return ++i; }

int a, b;
void test(int p1, int p2) {
    a = p1;
    b = p2;
}

// called from somewhere, e.g. main():
test( f(), f() );
// simulator: a = 2, b = 1
// device:    a = 1, b = 2
Trigonometry functions may return different results:
float a = cosf( 0.108271248639 );
printf("%.12f", a);
//simulator: 0.994144439697
//device: 0.994144380093
I know there are some differences in the OpenGL ES implementation between the device and the simulator. From what I understand this is mainly because the graphics chip in the iPhone (PowerVR MBX) has vastly different capabilities from the graphics hardware in Macs. Many of the hardware limits are not enforced in the simulator, so it is entirely possible to get something running in the simulator that will totally crash on the device.
There are also some OpenGL ES extensions that are supported by the iPhone hardware that are not supported in the simulator. I believe the major one is PowerVR texture compression (PVRTC).
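A possible guard (a sketch, assuming a current OpenGL ES 1.x context) is to check the extensions string at runtime before relying on PVRTC:
#import <OpenGLES/ES1/gl.h>
#include <string.h>

// Sketch: only upload PVRTC-compressed textures when the extension is reported.
const char *extensions = (const char *)glGetString(GL_EXTENSIONS);
BOOL supportsPVRTC = (extensions != NULL &&
                      strstr(extensions, "GL_IMG_texture_compression_pvrtc") != NULL);
if (!supportsPVRTC) {
    // fall back to uncompressed textures (e.g. in the simulator)
}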
Another problem area can be memory footprint. Anecdotally, I have not seen the simulator enforce the memory limitations of the device. So it is possible to have something that runs fine in the simulator, happily consuming copious amounts of RAM and never bothering to free any of it, only to be swiftly terminated for the same behaviour when running on a device.
There are certain bits of code that won't work on the simulator (using the iPhone Keychain, for example), but for almost all applications, the simulator will work exactly like the iPhone.
That said, there's absolutely no replacement for testing on a real device.
I had a problem running a relatively simple 1/30 sec timer to do updates for a game. It runs fine in the simulator, and freezes out input on the device.
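One common workaround (not necessarily what the original poster used) is to drive per-frame game updates from CADisplayLink instead of a plain timer, so updates stay tied to the display refresh; -update here is a hypothetical method on the same object:
#import <QuartzCore/QuartzCore.h>

// Sketch: roughly 30 updates per second on a 60 Hz display.
CADisplayLink *displayLink = [CADisplayLink displayLinkWithTarget:self
                                                         selector:@selector(update)];
displayLink.frameInterval = 2;
[displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSDefaultRunLoopMode];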
Also note that you will be compiling against the OS X frameworks (where applicable) when building for the simulator so you could be using methods and classes that aren't available on the iPhone versions of the frameworks.
One example I can think of off the top of my head is NSPredicate. I was able to compile and run an app using NSPredicate in the simulator, but it wouldn't compile for the device since that class isn't available.
Fingertips are larger than the 1 pixel endpoint of a mouse cursor. To do proper, even minimal, usability testing you should deploy your App to a device.
If you enable GCC_ENABLE_FLOATING_POINT_LIBRARY_CALLS, your app will crash all over the place in the simulator but work on the iPhone.
Quartz graphics calls in the iPhone simulator are faster than Java2D calls on the same computer--wicked fast.
I've had issues in memory-hungry applications where the Simulator would work just fine (because it would assume the iPhone/iPod Touch's memory was all yours to play with), while the device would crash (because other apps had leaked and Apple background services had eaten up some memory) and I hadn't implemented proper memory management or a response to the didReceiveMemoryWarning selector.
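A minimal sketch of that response in a view controller; the imageCache property is a hypothetical example of something that can be rebuilt later:
// Sketch: release anything recreatable when memory runs low.
- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    [self.imageCache removeAllObjects]; // hypothetical NSCache of decoded images
}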
One big thing that took me a while to spot was that the simulator does not support device tokens, so any code that involves those will not run on the simulator.
I had a bug where the app would work fine on the simulator, but would crash when I ran it on a device because there was a bug in the device token code. I couldn't figure it out for the longest time!
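For reference, a hedged sketch of the app-delegate callbacks involved; on the simulator the registration typically fails, so the failure path needs handling too:
// Sketch: push-token callbacks in the application delegate.
- (void)application:(UIApplication *)application
    didRegisterForRemoteNotificationsWithDeviceToken:(NSData *)deviceToken {
    NSLog(@"device token: %@", deviceToken);
}

- (void)application:(UIApplication *)application
    didFailToRegisterForRemoteNotificationsWithError:(NSError *)error {
    // Hit on the simulator (and on devices without push entitlements);
    // don't assume a token will always arrive.
    NSLog(@"push registration failed: %@", error);
}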
There are many trivial examples. For example, you can allocate far more memory on the simulator than on a real phone, and you cannot receive push notifications on the simulator. If you don't have a Retina Mac, the display dot pitch is different.
At a more fundamental level, the simulator is just that: it simulates iOS using Mac OS X. This is evidenced by the fact that the filesystem on the simulator may not be case-sensitive, but on the phone it will be. More subtly, it does not emulate hardware, so things like location will not work the same, and 3D is going to be very different, especially if you are using Metal.
You should always test on real hardware.
Without considering the performance differences between the two, there used to be some things that the simulator didn't do correctly, e.g. it would mess up audio in some cases (see this question). However, since the 2.2 SDK this issue has been resolved and the sound seems to be fine in the simulator. That's not to say there aren't other incompatibilities lurking down there! (Just none I've run into.)
Regarding sounds, I was having the same problem. The issue was that the set of sound encodings the Simulator supports is different from the set the device supports. I hope that helps.
I had many problems with libraries and frameworks when moving from the simulator to the device. Not least that they seem to have different architectures!
I have seen the positioning of objects, like toolbars be different on simulator than on the phone. Very annoying.
Yeah....
Apps compiled for 2.x will work fine on a 3.0 device, but will crash on the 3.0 Simulator.
Note:
1. If you compile for 3.0, the app will also work fine on the 3.0 simulator.
2. a) Compile for 2.x and launch the app in the simulator.
   b) Now change the iPhone Simulator Hardware to "3.0".
   c) Then launch the app installed in step a).
CRASH !!!!!!!!
In my experience, a movie file (m4v type) plays properly the first time, but the second time it flickers the simulator's screen, whereas on an iPhone device it works fine.
I had some sound effects that played fine in the simulator, but not on the device. I had to change the format to something that the device would handle.
If the application's status bar is hidden, the simulator still consumes touch events in that area, but the device behaves correctly.
Yes - it happened to me the other day. I'm new to the iPhone, and so had deleted MainWindow.xib thinking it was unused. The app worked perfectly on the simulator - but crashed when installed on the phone.
Another issue we ran into was our three20 dependencies, which were set to iOS 3.2 instead of 4.1. Worked perfectly in the simulator, but bombed on the device (since the files were compiled for the wrong architecture).
The iPhone video library is not accessible in the simulator, but the code works fine on an actual device.
Resource loading in the simulator is MUCH faster than in the device. For example, loading and displaying a sequence of full-screen UIImages (like a rudimentary video) can look very smooth in the simulator, and choppy on a device.
In fact, remember that there is a huge speed difference between different devices. The original iPhone and the iPhone 3G are slower than the iPod touch 2nd Gen, which is also much slower than the iPhone 3GS, and so on.
When trying to access UIDevice.currentDevice(), it returns 'iOS Simulator' instead of the actual device you're testing on. This sucks, since you can't do certain things on the simulator.
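If you need to branch explicitly, one option (a sketch using the TARGET_IPHONE_SIMULATOR macro from TargetConditionals.h) is to decide at compile time:
#include <TargetConditionals.h>

#if TARGET_IPHONE_SIMULATOR
    // Simulator-only path, e.g. stub out features the simulator cannot provide.
#else
    // Device-only path.
#endif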
Can one be simulated by periodically syncing with GPS and working with the accelerometer in the meantime? I know, for example, that the N95 accelerometer is invariant to rotation around the Y axis (while facing up or down).
The original iPhone and the iPhone 3G use GPS to calculate the heading, however the iPhone 3GS now has a 3-dimensional magnetometer compass in it.
This can only be done by taking two GPS coordinates (while moving) and determining the direction from point A to point B.
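On hardware that does have a magnetometer (3GS and later), heading comes from Core Location; a minimal sketch, assuming self retains a CLLocationManager and adopts CLLocationManagerDelegate:
#import <CoreLocation/CoreLocation.h>

// Sketch: start heading updates only where a magnetometer is present.
- (void)startCompass {
    if ([CLLocationManager headingAvailable]) {
        self.locationManager = [[CLLocationManager alloc] init];
        self.locationManager.delegate = self;
        [self.locationManager startUpdatingHeading];
    }
}

// Delegate callback: magneticHeading and trueHeading are reported in degrees.
- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)newHeading {
    NSLog(@"heading: %f", newHeading.magneticHeading);
}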
The iPhone doesn't have a built-in compass; but there is one created in software. It's called Compass Free, and unsurprisingly perhaps, it's free.
Extra info: the original iPhone did not have GPS or a compass.