Color depth in the iPhone simulator

I made a promotional video for my Exoplanet app and just noticed that the color depth in the iPhone simulator is much lower than on the physical device. The visualizations look splendid on the actual phone but rather cheap in the simulator. I mainly notice this when using textures in OpenGL and Core Animation.
Have a look at this short video to see what I mean. (It has nothing to do with the recording.)
Am I doing something wrong? Can I increase the color-depth of the simulator?

Related

Can I scale my iPad app to fit on an iPhone automatically

I have an iPad application built in Objective-C. I wanted to know if there are any tricks with UIWindow to scale the app to run on an iPhone in landscape mode, so that all my assumptions about frames being iPad-sized would still hold, but the app would fit on an iPhone. Just to carry around pocket demos, etc. I realize things would be hard to press.
No, you have it backwards: apps made for the phone can run on the iPad unaltered, either in 2x mode (stretched) or letterboxed.

glDrawElements incorrectly renders points on iPhone

I am fairly new to OpenGL development on iOS. I'm working on software that will create 3D reconstructions of objects in the form of *.ply files. I'm trying to make an iOS app to visualize these simple vertex-only *.ply files. Everything works as intended on the iPhone and iPad Simulator, but when I run it on my iPhone, the points rendered in the view are glitchy and covered with large squares. Here's the comparison: iPhone and simulator. Has anyone run into similar issues with OpenGL?
It's important to understand that when running OpenGL ES code on the simulator, you're actually running it on the simulator's software implementation and not on the GPU.
The simulator's implementation is close to, but not identical to, the implementation on the device GPU. This means that faulty code may render fine on the simulator. I've experienced this myself on a couple of occasions, for example when using GL buffers and not allocating enough storage.
It's obviously hard to say where your code goes wrong, but I'd suggest you go through it and look for subtle errors.
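To make the buffer point concrete, here is a minimal sketch (not the poster's code; the array and sizes are made up) of the kind of setup where passing too small a size to glBufferData can still appear to work under the simulator's software renderer:

#include <OpenGLES/ES1/gl.h>

GLfloat vertices[300 * 3];   // 300 points, x/y/z each (illustrative)
GLuint vbo;

glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
// The size passed here must cover every vertex you later draw from this
// buffer; under-allocating may look fine in the simulator but produce
// garbage (or crash) on the device GPU.
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);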

Flash iPhone app pixelated once displayed on iPhone4

I have made the app at the larger resolution for the iPhone 4, and when I test it on a 3GS it works and looks fine; the app scales down accordingly. But when I test the app on an iPhone 4 it appears, for some reason, to scale it down and then back up again, creating a pixelated look. This applies to vector assets and text within the Flash project as well, which is even weirder in my opinion. My SWF is 640x920px.
According to this post, retina-display graphics are not yet supported. The two output sizes are either low-resolution iPhone or iPad. To wit:
The current version of PFI produces apps that run at 320x480 on iPhones and iPods or 1024x768 on iPads. Support for 960x640 is something we're looking into adding for a future release.

Can the iPhone simulator handle PVR textures?

I have a really weird problem with PVR textures: the framerate falls through the floor on the iPhone simulator, but on the iPhone itself it works just fine. Has anyone had any experiences similar to this? I'm using SDK 3.1.2.
The iPhone simulator is known to be extremely slow for certain rendering scenarios.
One especially bad case we experienced: creating (glGenTextures + glTexImage2D) and then destroying (glDeleteTextures) a big texture (a title screen) would kill all performance until a hard restart of the simulator.
Testing your "logic" on the simulator and "assuming" it will be fast on the device has been working pretty well for us.
As always: Try avoiding state changes!
And yes: PVR textures are supported, but simulated in the shader.
Yes, the simulator supports PVRTC textures, but it probably decompresses them into an RGB format during upload or when a texture is used for the first time. In my game this causes a big slowdown until every PVR texture has been shown at least once. Of course, those slowdowns appear in the simulator only; on the actual device PVRTC textures are very, very fast.
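As a rough illustration of what that upload looks like (a sketch only; width, height, pvrtcData and pvrtcSize are assumed to come from your own .pvr loader), PVRTC data goes through glCompressedTexImage2D rather than glTexImage2D:

#include <OpenGLES/ES1/gl.h>
#include <OpenGLES/ES1/glext.h>

GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
// 4-bpp PVRTC: the data stays compressed in VRAM on the device,
// while the simulator appears to decompress it, hence the slowdown.
glCompressedTexImage2D(GL_TEXTURE_2D, 0,
                       GL_COMPRESSED_RGB_PVRTC_4BPPV1_IMG,
                       width, height, 0, pvrtcSize, pvrtcData);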
The iPhone simulator is notorious for being only almost like the iPhone.
I can't speak to PVR textures, as I'm not sure. But from other things I have done (and from what I've read on the internet), most developers give up on the simulator rather quickly due to its minor differences from the real thing.
In the end if it works on the iPhone, then the simulator doesn't matter.

iPhone device vs. iPhone simulator

I have heard of apps not working properly on the simulator but working properly on the actual iPhone device. Has anyone experienced an app that runs perfectly in the simulator but not on the actual iPhone device?
Filenames are case-sensitive on the iPhone, but not in the simulator.
So, for example, if you try to load an image with UIImage *iconImage = [UIImage imageNamed:@"MyIcon.png"], but your resource is actually named "myicon.png", then it will work on the simulator but not on the device.
If your app is graphics-intensive, like, say, a game, the performance of the simulator DOES NOT resemble that of the hardware at all. Your application will probably be smooth and work great on the simulator, but on hardware it'll likely render at a crawl unless you know what you're doing. You can easily go from 60fps in the simulator to 3fps on hardware.
The order in which function/constructor parameters are evaluated is different:
int i = 0;
int f() { return ++i; }

int a, b;
void test(int p1, int p2) {
    a = p1;
    b = p2;
}

test( f(), f() );
// simulator: a = 2, b = 1
// device:    a = 1, b = 2
Trigonometry functions may return different results:
float a = cosf( 0.108271248639 );
printf("%.12f", a);
//simulator: 0.994144439697
//device: 0.994144380093
I know there are some differences in the OpenGL ES implementation between the device and the simulator. From what I understand this is mainly because the graphics chip in the iPhone (PowerVR MBX) has vastly different capabilities from the GPUs in Mac machines. Many of the hardware limits are not enforced in the simulator, so it is entirely possible to get something running in the simulator that will totally crash on the device.
There are also some OpenGL ES extensions that are supported by the iPhone hardware that are not supported in the simulator. I believe the major one is PowerVR texture compression (PVRTC).
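If you want to guard against that, one small sketch (not from the answer above) is to check the extension string at runtime instead of assuming the simulator and the device expose the same set:

#include <OpenGLES/ES1/gl.h>
#include <string.h>

const char *extensions = (const char *)glGetString(GL_EXTENSIONS);
// Only take the PVRTC path when the extension is actually reported.
BOOL hasPVRTC = (extensions != NULL &&
                 strstr(extensions, "GL_IMG_texture_compression_pvrtc") != NULL);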
Another problem area can be memory footprint. Anecdotally, I have not seen the simulator automatically enforce the memory limitations of the device. So it is possible to have something that runs fine in the simulator, happily consuming copious amounts of RAM and never freeing any of it, only to be swiftly terminated for the same behaviour when running on a device.
There are certain bits of code that won't work on the simulator (using the iPhone Keychain, for example), but for almost all applications, the simulator will work exactly like the iPhone.
That said, there's absolutely no replacement for testing on a real device.
I had a problem running a relatively simple 1/30 sec timer to do updates for a game. It runs fine in the simulator, and freezes out input on the device.
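For context, the kind of update timer in question would look something like this (the selector name is just illustrative):

// 1/30-second game-update timer: ran smoothly in the simulator but
// starved touch handling on the device in my case.
[NSTimer scheduledTimerWithTimeInterval:(1.0 / 30.0)
                                 target:self
                               selector:@selector(updateGame:)
                               userInfo:nil
                                repeats:YES];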
Also note that you will be compiling against the OS X frameworks (where applicable) when building for the simulator, so you could be using methods and classes that aren't available in the iPhone versions of the frameworks.
One example I can think of off the top of my head is NSPredicate. I was able to compile and run an app using NSPredicate in the simulator, but it wouldn't compile for the device since that class isn't available.
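When a class is declared in the SDK you compile against but may be missing at runtime on an older OS, one hedged workaround is a runtime lookup instead of a hard reference (this wouldn't have helped in my case, where the class was absent from the device SDK entirely, but it covers the more common availability gap):

Class predicateClass = NSClassFromString(@"NSPredicate");
if (predicateClass != nil) {
    id p = [predicateClass predicateWithFormat:@"name == %@", @"Foo"];
    // ... use the predicate ...
} else {
    // Fall back to filtering by hand when the class is unavailable.
}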
Fingertips are larger than the 1 pixel endpoint of a mouse cursor. To do proper, even minimal, usability testing you should deploy your App to a device.
If you enable GCC_ENABLE_FLOATING_POINT_LIBRARY_CALLS, your app will crash all over the place in the simulator but work on the iPhone.
Quartz graphics calls in the iPhone simulator are faster than Java2D calls on the same computer--wicked fast.
I've had issues in memory-hungry applications where the Simulator would work just fine (because it would assume the iPhone/iPod Touch's memory was all yours to play with), while the device would crash (because other apps had leaked and Apple background services had eaten up some memory) and I hadn't implemented proper memory management or a response to the didReceiveMemoryWarning selector.
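A minimal sketch of responding to that warning in a view controller (the imageCache property here is hypothetical):

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Drop anything that can be rebuilt later, e.g. a hypothetical image cache.
    [self.imageCache removeAllObjects];
}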
One big thing that took me a while to spot was that the simulator does not support device tokens, so any code that involves those will not run on the simulator.
I had a bug where the app would work fine on the simulator, but would crash when I ran it on a device because there was a bug in the device token code. I couldn't figure it out for the longest time!
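For reference, a sketch of the (older-style) registration calls involved; on the simulator you end up in the failure callback rather than receiving a token:

// e.g. in application:didFinishLaunchingWithOptions:
[[UIApplication sharedApplication]
    registerForRemoteNotificationTypes:(UIRemoteNotificationTypeAlert |
                                        UIRemoteNotificationTypeBadge |
                                        UIRemoteNotificationTypeSound)];

- (void)application:(UIApplication *)app
    didRegisterForRemoteNotificationsWithDeviceToken:(NSData *)token {
    // Only reached on a real device.
}

- (void)application:(UIApplication *)app
    didFailToRegisterForRemoteNotificationsWithError:(NSError *)error {
    // The simulator lands here, so handle it gracefully instead of crashing.
}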
There are many trivial examples. For example, you can allocate far more memory in the simulator than on a real phone. You cannot receive push notifications in the simulator, and if you don't have a Retina Mac, the display dot pitch is different.
At a more fundamental level, the simulator is just that: it simulates iOS on top of Mac OS X. This is evidenced by the fact that the filesystem in the simulator may not be case-sensitive, while on the phone it will be. More subtly, it does not emulate the hardware, so things like location will not work the same, and 3D is going to be very different - especially if you are using Metal.
You should always test on real hardware.
Without considering the performance differences between the two, there used to be some things that the simulator didn't do correctly - e.g. it would mess up audio in some cases (see this question). However, since the 2.2 SDK this issue has been resolved and the sound seems fine in the simulator. That's not to say there aren't other incompatibilities lurking down there! (Just none I've run into.)
Regarding sounds, I was having the same problem. The issue was that the set of sound encodings the Simulator supports is different from what the device supports. I hope that helps.
I had many problems with libraries and frameworks when moving from the simulator to the device. Not least that they seem to have different architectures!
I have seen the positioning of objects, like toolbars, be different in the simulator than on the phone. Very annoying.
Yes. Apps compiled for 2.x will work fine on a 3.0 device, but they will crash in the 3.0 simulator.
Note:
1. If you compile for 3.0, the app will also work fine in the 3.0 simulator.
2. To reproduce the crash:
a) Compile for 2.x and launch the app in the simulator.
b) Now change the iPhone Simulator hardware version to "3.0".
c) Then launch the app installed in step a). Crash!
In my experience, a movie file (m4v) plays properly the first time, but the second time the simulator's screen flickers, whereas on an iPhone device it works fine.
I had some sound effects that played fine in the simulator, but not on the device. I had to change the format to something that the device would handle.
If the application's status bar is hidden, the simulator still consumes touch events in the status bar area, but the device behaves correctly.
Yes - it happened to me the other day. I'm new to the iPhone and so had deleted MainWindow.xib, thinking it was unused. The app worked perfectly in the simulator - but crashed when installed on the phone.
Another issue we ran into was our three20 dependencies, which were set to iOS 3.2 instead of 4.1. Everything worked perfectly in the simulator, but bombed on the device (since the files were compiled for the wrong architecture).
The iPhone video library is not accessible in the simulator, but the same code works fine on an actual device.
Resource loading in the simulator is MUCH faster than in the device. For example, loading and displaying a sequence of full-screen UIImages (like a rudimentary video) can look very smooth in the simulator, and choppy on a device.
In fact, remember that there is a huge speed difference between different devices. The original iPhone and the iPhone 3G are slower than the iPod touch 2nd Gen, which is also much slower than the iPhone 3GS, and so on.
When you access UIDevice.currentDevice(), it returns the iOS Simulator instead of the actual device you're testing on. This sucks, since you can't do certain things on the simulator.
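If you need to branch on that, one common approach is a compile-time check with the macro from TargetConditionals.h (shown here as a minimal sketch):

#include <TargetConditionals.h>

#if TARGET_IPHONE_SIMULATOR
    // Stub out whatever the simulator can't do (camera, push tokens, etc.).
#else
    // Real device: use the hardware-only code paths.
#endif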